Attempts to turn homosexuals into heterosexuals using “scientific” methods go back more than 150 years. According to researchers James Naylor Green and Ronald Polito, medicine has tried everything to “cure” them.
“Confinement, electric shocks, heavy medication, psychological or psychiatric treatment, individual, group, and family psychoanalysis, straitjackets, testicle transplants: these are some of the ‘techniques’ of intervention in the bodies and minds of men who prefer to relate affectively and sexually with other men,” they write in “Frescos Trópicos.”
In the book, the authors examine the period between the 1870s and the 1980s, drawing on material published during that era. Below, read an excerpt from “Frescos Trópicos.”
“One could say that, over the past 150 years, medicine has tried or proposed everything to ‘cure’ homosexuals. Confinement, electric shocks, heavy medication, psychological or psychiatric treatment, individual, group, and family psychoanalysis, straitjackets, testicle transplants: these are some of the ‘techniques’ of intervention in the bodies and minds of men who prefer to relate affectively and sexually with other men.
Among countless examples from the past, consider Pires de Almeida, who in ‘Homossexualismo’ (1906) proposed a specific treatment for the ‘inverted.’ But first, let us understand what he meant by ‘inverted’: ‘one who, from birth, is already inverted, and who, in every sexual association, plays the male role: he is, therefore, a male even more male, if we are speaking of a man.’ The ‘inverted,’ then, are born homosexual, unlike the ‘perverted’ who, according to the author, ‘after having been sexually normal, became inverted for some reason.’
For Pires de Almeida, the treatment of the ‘perverted’ is only slightly simpler than that of the ‘inverted.’ For the latter he recommends, among other procedures:
‘The invert should be accompanied from childhood, watched over by a kind of tutor who, like a moral orthopedic apparatus, would stand as an obstacle to deviation, working persistently so that consolidation is fully achieved. (…)
Above all, we must remember that such disorders are purely mental illnesses; and for that reason I would advise, when we have not followed the individual since childhood and have begun treatment at a late age, medicating him through suggestive aesthetics; that is, through magnetism and suggestion combined: properly orienting his spirit, directing his attention to the beauty of female forms, surrounding him with celebrated models in painting and above all in statuary, and obliging him to read romantic works in which such beauties awaken tumultuous passions. Encounters will be arranged for him with plastically sensual women, easy with caresses, graceful, coquettish; at first one will not hesitate even before certain subterfuges, such as, for example, provoking the invert’s coitus with women dressed as men; or even obliging him to spend the night with completely naked women, even if he does not enjoy them.
If, however, the patient feels an invincible repulsion toward such ambiguous company, recourse will be had to a different milieu: attractive women, yes, but pure, utterly pure, virtuous: the perfumed bosom of families.’”
“Frescos Trópicos”
Authors: James Naylor Green and Ronald Polito
Publisher: José Olympio
Pages: 196
Price: R$ 23.90 (promotional price*)
Where to buy: by phone at 0800-140090 or through the Livraria da Folha website
“It’s as safe as Dawn dishwashing liquid.” That’s what Jamie Griffin says the BP man told her about the smelly, rainbow-streaked gunk coating the floor of the “floating hotel” where Griffin was feeding hundreds of cleanup workers during the BP oil disaster in the Gulf of Mexico. Apparently, the workers were tracking the gunk inside on their boots. Griffin, as chief cook and maid, was trying to clean it. But even boiling water didn’t work.
“The BP representative said, ‘Jamie, just mop it like you’d mop any other dirty floor,’” Griffin recalls in her Louisiana drawl.
It was the opening weeks of what everyone, echoing President Barack Obama, was calling “the worst environmental disaster in American history.” At 9:45 p.m. local time on April 20, 2010, a fiery explosion on the Deepwater Horizon oil rig had killed 11 workers and injured 17. One mile underwater, the Macondo well had blown apart, unleashing a gusher of oil into the gulf. At risk were fishing areas that supplied one third of the seafood consumed in the U.S., beaches from Texas to Florida that drew billions of dollars’ worth of tourism to local economies, and Obama’s chances of reelection. Republicans were blaming him for mishandling the disaster, his poll numbers were falling, even his 11-year-old daughter was demanding, “Daddy, did you plug the hole yet?”
Griffin did as she was told: “I tried Pine-Sol, bleach, I even tried Dawn on those floors.” As she scrubbed, the mix of cleanser and gunk occasionally splashed onto her arms and face.
Within days, the 32-year-old single mother was coughing up blood and suffering constant headaches. She lost her voice. “My throat felt like I’d swallowed razor blades,” she says.
Then things got much worse.
Like hundreds, possibly thousands, of workers on the cleanup, Griffin soon fell ill with a cluster of excruciating, bizarre, grotesque ailments. By July, unstoppable muscle spasms were twisting her hands into immovable claws. In August, she began losing her short-term memory. After cooking professionally for 10 years, she couldn’t remember the recipe for vegetable soup; one morning, she got in the car to go to work, only to discover she hadn’t put on pants. The right side, but only the right side, of her body “started acting crazy. It felt like the nerves were coming out of my skin. It was so painful. My right leg swelled—my ankle would get as wide as my calf—and my skin got incredibly itchy.”
“These are the same symptoms experienced by soldiers who returned from the Persian Gulf War with Gulf War syndrome,” says Dr. Michael Robichaux, a Louisiana physician and former state senator, who treated Griffin and 113 other patients with similar complaints. As a general practitioner, Robichaux says he had “never seen this grouping of symptoms together: skin problems, neurological impairments, plus pulmonary problems.” Only months later, after Kaye H. Kilburn, a former professor of medicine at the University of Southern California and one of the nation’s leading environmental health experts, came to Louisiana and tested 14 of Robichaux’s patients did the two physicians make the connection with Gulf War syndrome, the malady that afflicted an estimated 250,000 veterans of that war with a mysterious combination of fatigue, skin inflammation, and cognitive problems.
Meanwhile, the well kept hemorrhaging oil. The world watched with bated breath as BP failed in one attempt after another to stop the leak. An agonizing 87 days passed before the well was finally plugged on July 15. By then, 210 million gallons of Louisiana sweet crude had escaped into the Gulf of Mexico, according to government estimates, making the BP disaster the largest accidental oil leak in world history.
In 2010, Pulitzer Prize-winning animator Mark Fiore created a humorous and poignant animated take on the BP oil spill.
Yet three years later, the BP disaster has been largely forgotten, both overseas and in the U.S. Popular anger has cooled. The media have moved on. Today, only the business press offers serious coverage of what the Financial Times calls “the trial of the century”—the trial now under way in New Orleans, where BP faces tens of billions of dollars in potential penalties for the disaster. As for Obama, the same president who early in the BP crisis blasted the “scandalously close relationship” between oil companies and government regulators two years later ran for reelection boasting about how much new oil and gas development his administration had approved.
Such collective amnesia may seem surprising, but there may be a good explanation for it: BP mounted a cover-up that concealed the full extent of its crimes from public view. This cover-up prevented the media and therefore the public from knowing—and above all, seeing—just how much oil was gushing into the gulf. The disaster appeared much less extensive and destructive than it actually was. BP declined to comment for this article.
That BP lied about the amount of oil it discharged into the gulf is already established. Lying to Congress about that was one of 14 felonies to which BP pleaded guilty last year in a legal settlement with the Justice Department that included a $4.5 billion fine, the largest fine ever levied against a corporation in the U.S.
What has not been revealed until now is how BP hid that massive amount of oil from TV cameras and the price that this “disappearing act” imposed on cleanup workers, coastal residents, and the ecosystem of the gulf. That story can now be told because an anonymous whistleblower has provided evidence that BP was warned in advance about the safety risks of attempting to cover up its leaking oil. Nevertheless, BP proceeded. Furthermore, BP appears to have withheld these safety warnings, as well as protective measures, both from the thousands of workers hired for the cleanup and from the millions of Gulf Coast residents who stood to be affected.
The financial implications are enormous. The trial now under way in New Orleans is wrestling with whether BP was guilty of “negligence” or “gross negligence” for the Deepwater Horizon disaster. If found guilty of “negligence,” BP would be fined, under the Clean Water Act, $1,100 for each barrel of oil that leaked. But if found guilty of “gross negligence”—which a cover-up would seem to imply—BP would be fined $4,300 per barrel, almost four times as much, for a total of $17.5 billion. That large a fine, combined with an additional $34 billion that the states of Louisiana, Alabama, Mississippi, and Florida are seeking, could have a powerful effect on BP’s economic health.
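As a rough sanity check on the figures in the paragraph above, the fine arithmetic can be sketched in a few lines. The per-barrel amounts come from the article itself; the barrel count is back-calculated from the quoted $17.5 billion total and is not an official figure:

```python
# Clean Water Act fine arithmetic as described in the article.
# Per-barrel rates are from the article; the barrel count is
# back-calculated from the quoted $17.5 billion, not an official figure.
NEGLIGENCE_RATE = 1_100        # dollars per barrel, simple negligence
GROSS_NEGLIGENCE_RATE = 4_300  # dollars per barrel, gross negligence

quoted_total = 17.5e9  # the $17.5 billion figure quoted in the article
barrels = quoted_total / GROSS_NEGLIGENCE_RATE

print(f"implied barrels leaked: {barrels / 1e6:.2f} million")
print(f"fine if simple negligence: ${barrels * NEGLIGENCE_RATE / 1e9:.1f} billion")
print(f"fine if gross negligence:  ${barrels * GROSS_NEGLIGENCE_RATE / 1e9:.1f} billion")
```

On these numbers, the quoted total implies a leak of roughly four million barrels, and the gap between the two penalty levels is the near-fourfold per-barrel difference the article describes.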
Yet the most astonishing thing about BP’s cover-up? It was carried out in plain sight, right in front of the world’s uncomprehending news media (including, I regret to say, this reporter).
The chief instrument of BP’s cover-up was the same substance that apparently sickened Jamie Griffin and countless other cleanup workers and local residents. Its brand name is Corexit, but most news reports at the time referred to it simply as a “dispersant.” Its function was to attach itself to leaked oil, break it into droplets, and disperse them into the vast reaches of the gulf, thereby keeping the oil from reaching Gulf Coast shorelines. And the Corexit did largely achieve this goal.
But the 1.84 million gallons of Corexit that BP applied during the cleanup also served a public-relations purpose: they made the oil spill all but disappear, at least from TV screens. By late July 2010, the Associated Press and The New York Times were questioning whether the spill had been such a big deal after all. Time went so far as to assert that right-wing talk-radio host Rush Limbaugh “has a point” when he accused journalists and environmentalists of exaggerating the crisis.
But BP had a problem: it had lied about how safe Corexit is, and proof of its dishonesty would eventually fall into the hands of the Government Accountability Project, the premier whistleblower-protection group in the U.S. The proof? A technical manual BP had received from NALCO, the firm that supplied the Corexit that BP used in the gulf.
An electronic copy of that manual is included in a new report GAP has issued, “Deadly Dispersants in the Gulf.” On the basis of interviews with dozens of cleanup workers, scientists, and Gulf Coast residents, GAP concludes that the health impacts endured by Griffin were visited upon many other locals as well. What’s more, the combination of Corexit and crude oil also caused terrible damage to gulf wildlife and ecosystems, including an unprecedented number of seafood mutations; declines of up to 80 percent in seafood catch; and massive die-offs of the microscopic life-forms at the base of the marine food chain. GAP warns that BP and the U.S. government nevertheless appear poised to repeat the exercise after the next major oil spill: “As a result of Corexit’s perceived success, Corexit … has become the dispersant of choice in the U.S. to ‘clean up’ oil spills.”
BP’s cover-up was not planned in advance but devised in the heat of the moment as the oil giant scrambled to limit the PR and other damages of the disaster. Indeed, one of the chief scandals of the disaster is just how unprepared both BP and federal and state authorities were for an oil leak of this magnitude. U.S. law required that a response plan be in place before drilling began, but the plan was embarrassingly flawed.
“We weren’t managing for actual risk; we were checking a box,” says Mark Davis, director of the Institute on Water Resources Law and Policy at Tulane University. “That’s how we ended up with a response plan that included provisions for dealing with the impacts to walruses: because [BP] copied word for word the response plans that had been developed after the Exxon-Valdez oil spill [in Alaska, in 1989] instead of a plan tailored to the conditions in the gulf.”
As days turned into weeks and it became obvious that no one knew how to plug the gushing well, BP began insisting that Corexit be used to disperse the leaking oil. This triggered alarms from scientists and from a leading environmental NGO in Louisiana, the Louisiana Environmental Action Network (LEAN).
The group’s scientific adviser, Wilma Subra, a chemist whose work on environmental pollution had won her a “genius grant” from the MacArthur Foundation, told state and federal authorities that she was especially concerned about how dangerous the mixture of crude and Corexit was: “The short-term health symptoms include acute respiratory problems, skin rashes, cardiovascular impacts, gastrointestinal impacts, and short-term loss of memory,” she told GAP investigators. “Long-term impacts include cancer, decreased lung function, liver damage, and kidney damage.”
(Nineteen months after the Deepwater Horizon explosion, a scientific study published in the peer-reviewed journal Environmental Pollution found that crude oil becomes 52 times more toxic when combined with Corexit.)
BP even rebuffed a direct request from the administrator of the Environmental Protection Agency, Lisa Jackson, who wrote BP a letter on May 19, asking the company to deploy a less toxic dispersant in the cleanup. Jackson could only ask BP to do this; she could not legally require it. Why? Because use of Corexit had been authorized years before under the federal Oil Pollution Act.
In a recent interview, Jackson explains that she and other officials “had to determine, with less-than-perfect scientific testing and data, whether use of dispersants would, despite potential side effects, improve the overall situation in the gulf and coastal ecosystems. The tradeoff, as I have said many times, was potential damage in the deep water versus the potential for larger amounts of undispersed oil in the ecologically rich coastal shallows and estuaries.” She adds that the presidential commission that later studied the BP oil disaster did not fault the decision to use dispersants.
Knowing that EPA lacked the authority to stop it, BP wrote back to Jackson on May 20, declaring that Corexit was safe. What’s more, BP wrote, there was a ready supply of Corexit, which was not the case with alternative dispersants. (A NALCO plant was located just 30 miles west of New Orleans.)
But Corexit was decidedly not safe without taking proper precautions, as the manual BP got from NALCO spelled out in black and white. The “Vessel Captains Hazard Communication” resource manual, which GAP shared with me, looks innocuous enough. A three-ring binder with a black plastic cover, the manual contained 61 sheets, each wrapped in plastic, that detailed the scientific properties of the two types of Corexit that BP was buying, as well as their health hazards and recommended measures against those hazards.
BP applied two types of Corexit in the gulf. The first, Corexit 9527, was considerably more toxic. According to the NALCO manual, Corexit 9527 is an “eye and skin irritant. Repeated or excessive exposure … may cause injury to red blood cells (hemolysis), kidney or the liver.” The manual adds: “Excessive exposure may cause central nervous system effects, nausea, vomiting, anesthetic or narcotic effects.” It advises, “Do not get in eyes, on skin, on clothing,” and “Wear suitable protective clothing.”
When available supplies of Corexit 9527 were exhausted early in the cleanup, BP switched to the second type of dispersant, Corexit 9500. In its recommendations for dealing with Corexit 9500, the NALCO manual advised, “Do not get in eyes, on skin, on clothing,” “Avoid breathing vapor,” and “Wear suitable protective clothing.”
It’s standard procedure—and required by U.S. law—for companies to distribute this kind of information to any work site where hazardous materials are present so workers can know about the dangers they face and how to protect themselves. But interviews with numerous cleanup workers suggest that this legally required precaution was rarely if ever followed during the BP cleanup. Instead, it appears that BP told NALCO to stop including the manuals with the Corexit that NALCO was delivering to cleanup work sites.
“It’s my understanding that some manuals were sent out with the shipments of Corexit in the beginning [of the cleanup],” the anonymous source tells me. “Then, BP told NALCO to stop sending them. So NALCO was left with a roomful of unused binders.”
Roman Blahoski, NALCO’s director of global communications, says: “NALCO responded to requests for its pre-approved dispersants from those charged with protecting the gulf and mitigating the environmental, health, and economic impact of this event. NALCO was never involved in decisions relating to the use, volume, and application of its dispersant.”
Misrepresenting the safety of Corexit went hand in hand with BP’s previously noted lie about how much oil was leaking from the Macondo well. As reported by John Rudolf in The Huffington Post, internal BP emails show that BP privately estimated that “the runaway well could be leaking from 62,000 barrels a day to 146,000 barrels a day.” Meanwhile, BP officials were telling the government and the media that only 5,000 barrels a day were leaking.
In short, applying Corexit enabled BP to mask the fact that a much larger amount of oil was actually leaking into the gulf. “Like any good magician, the oil industry has learned that if you can’t see something that was there, it must have ‘disappeared,’” Scott Porter, a scientist and deep-sea diver who consults for oil companies and oystermen, says in the GAP report. “Oil companies have also learned that, in the public mind, ‘out of sight equals out of mind.’ Therefore, they have chosen crude oil dispersants as the primary tool for handling large marine oil spills.”
BP also had a more direct financial interest in using Corexit, argues Clint Guidry, president of the Louisiana Shrimp Association, whose members include not only shrimpers but fishermen of all sorts. As it happens, local fishermen constituted a significant portion of BP’s cleanup force (which numbered as many as 47,000 workers at the height of the cleanup). Because the spill caused the closure of their fishing grounds, BP and state and federal authorities established the Vessels of Opportunity (VoO) program, in which BP paid fishermen to take their boats out and skim, burn, and otherwise get rid of leaked oil. Applying dispersants, Guidry points out, reduced the total volume of oil that could be traced back to BP.
“The next phase of this trial [against BP] is going to turn on how much oil was leaked,” Guidry tells me. [If found guilty, BP will be fined a certain amount for each barrel of oil judged to have leaked.] “So hiding the oil with Corexit worked not only to hide the size of the spill but also to lower the amount of oil that BP may get charged for releasing.”
Not only did BP fail to inform workers of the potential hazards of Corexit and to provide them with safety training and protective gear, according to interviews with dozens of cleanup workers, the company also allegedly threatened to fire workers who complained about the lack of respirators and protective clothing.
“I worked with probably a couple hundred different fishermen on the [cleanup],” Acy Cooper, Guidry’s second in command, tells me in Venice, the coastal town from which many VoO vessels departed. “Not one of them got any safety information or training concerning the toxic materials they encountered.” Cooper says that BP did provide workers with body suits and gloves designed for handling hazardous materials. “But when I’d talk with [the BP representative] about getting my guys respirators and air monitors, I’d never get any response.”
Roughly 58 percent of the 1.84 million gallons of Corexit used in the cleanup was sprayed onto the gulf from C-130 airplanes. The spray sometimes ended up hitting cleanup workers in the face.
“Our boat was sprayed four times,” says Jorey Danos, a 32-year-old father of three who suffered racking coughing fits, severe fatigue, and memory loss after working on the BP cleanup. “I could see the stuff coming out of the plane—like a shower of mist, a smoky color. I could see [it] coming at me, but there was nothing I could do.”
“The next day,” Danos continues, “when the BP rep came around on his speed boat, I asked, ‘Hey, what’s the deal with that stuff that was coming out of those planes yesterday?’ He told me, ‘Don’t worry about it.’ I said, ‘Man, that s–t was burning my face—it ain’t right.’ He said, ‘Don’t worry about it.’ I said, ‘Well, could we get some respirators or something, because that s–t is bad.’ He said, ‘No, that wouldn’t look good to the media. You got two choices: you can either be relieved of your duties or you can deal with it.’”
Perhaps the single most hazardous chemical compound found in Corexit 9527 is 2-Butoxyethanol, a substance that had been linked to cancers and other health impacts among cleanup workers on the 1989 Exxon-Valdez oil spill in Alaska. According to BP’s own data, 20 percent of offshore workers in the gulf had levels of 2-Butoxyethanol two times higher than the level certified as safe by the Occupational Safety and Health Administration.
Cleanup workers were not the only victims; coastal residents also suffered. “My 2-year-old grandson and I would play out in the yard,” says Shirley Tillman of the Mississippi coastal town Pass Christian. “You could smell oil and stuff in the air, but on the news they were saying it’s fine, don’t worry. Well, by October, he was one sick little fellow. All of a sudden, this very active little 2-year-old was constantly sick. He was having headaches, upper respiratory infections, earaches. The night of his birthday party, his parents had to rush him to the emergency room. He went to nine different doctors, but they treated just the symptoms; they’re not toxicologists.”
“It’s not the crime, it’s the cover-up.” Ever since the Watergate scandal of the 1970s, that’s been the mantra. Cover-ups don’t work, goes the argument. They only dig a deeper hole, because the truth eventually comes out.
But does it?
GAP investigators were hopeful that obtaining the NALCO manual might persuade BP to meet with them, and it did. On July 10, 2012, BP hosted a private meeting at its Houston offices. Presiding over the meeting, which is described here publicly for the first time, was BP’s public ombudsman, Stanley Sporkin, joining by telephone from Washington. Ironically, Sporkin had made his professional reputation during the Watergate scandal. As a lawyer with the Securities and Exchange Commission, Sporkin investigated illegal corporate payments to the slush fund that President Nixon used to buy the silence of the Watergate burglars.
Also attending the meeting were two senior BP attorneys; BP Vice President Luke Keller; other BP officials; Thomas Devine, GAP’s senior attorney on the BP case; Shanna Devine, GAP’s investigator on the case; Dr. Michael Robichaux; Dr. Wilma Subra; and Marylee Orr, the executive director of LEAN. The following account is based on my interviews with Thomas Devine, Robichaux, Subra, and Orr. BP declined to comment.
BP officials had previously confirmed the authenticity of the NALCO manual, says Thomas Devine, but now they refused to discuss it, even though this had been one of the stated purposes for the meeting. Nor would BP address the allegation, made by the whistleblower who had given the manual to GAP, that BP had ordered the manual withheld from cleanup work sites, perhaps to maintain the fiction that Corexit was safe.
“They opened the meeting with this upbeat presentation about how seriously they took their responsibilities for the spill and all the wonderful things they were doing to make things right,” says Devine. “When it was my turn to speak, I said that the manual our whistleblower had provided contradicted what they just said. I asked whether they had ordered the manual withdrawn from work sites. Their attorneys said that was a matter they would not discuss because of the pending litigation on the spill.” [Disclosure: Thomas Devine is a friend of this reporter.]
The visitors’ top priority was to get BP to agree not to use Corexit in the future. Keller said that Corexit was still authorized for use by the U.S. government and BP would indeed feel free to use it against any future oil spills.
A second priority was to get BP to provide medical treatment for Jamie Griffin and the many other apparent victims of Corexit-and-crude poisoning. This request too was refused by BP.
Robichaux doubts his patients will receive proper compensation from the $7.8 billion settlement BP reached in 2012 with the Plaintiffs’ Steering Committee, 19 court-appointed attorneys who represent the hundreds of individuals and entities that have sued BP for damages related to the gulf disaster. “Nine of the most common symptoms of my patients do not appear on the list of illnesses that settlement says can be compensated, including memory loss, fatigue, and joint and muscular pain,” says Robichaux. “So how are the attorneys going to file suits on behalf of those victims?”
At one level, BP’s cover-up of the gulf oil disaster speaks to the enormous power that giant corporations exercise in modern society, and how unable, or unwilling, governments are to limit that power. To be sure, BP has not entirely escaped censure for its actions; depending on the outcome of the trial now under way in New Orleans, the company could end up paying tens of billions of dollars in fines and damages over and above the $4.5 billion imposed by the Justice Department in the settlement last year. But BP’s reputation appears to have survived: its market value as this article went to press was a tidy $132 billion, and few, if any, BP officials appear likely to face any legal repercussions. “If I would have killed 11 people, I’d be hanging from a noose,” says Jorey Danos. “Not BP. It’s the golden rule: the man with the gold makes the rules.”
As unchastened as anyone at BP is Bob Dudley, the American who was catapulted into the CEO job a few weeks into the gulf disaster to replace Tony Hayward, whose propensity for imprudent comments—“I want my life back,” the multimillionaire had pouted while thousands of gulf workers and residents were suffering—had made him a globally derided figure. Dudley told the annual BP shareholders meeting in London last week that Corexit “is effectively … dishwashing soap,” no more toxic than that, as all scientific studies supposedly showed. What’s more, Dudley added, he himself had grown up in Mississippi and knows that the Gulf of Mexico is “an ecosystem that is used to oil.”
Nor has the BP oil disaster triggered the kind of changes in law and public priorities one might have expected. “Not much has actually changed,” says Mark Davis of Tulane. “It reflects just how wedded our country is to keeping the Gulf of Mexico producing oil and bringing it to our shores as cheaply as possible. Going forward, no one should assume that just because something really bad happened we’re going to manage oil and gas production with greater sensitivity and wisdom. That will only happen if people get involved and compel both the industry and the government to be more diligent.”
And so the worst environmental disaster in U.S. history has been whitewashed—its true dimensions obscured, its victims forgotten, its lessons ignored. Who says cover-ups never work?
Mark Hertsgaard is a fellow at the New American Foundation and the author, most recently, of HOT: Living Through the Next Fifty Years on Earth. This article was reported in partnership with the Investigative Fund at the Nation Institute.
Apr. 19, 2013 — Mathematical prediction models are better than doctors at predicting the outcomes and responses of lung cancer patients to treatment, according to new research presented today (Saturday) at the 2nd Forum of the European Society for Radiotherapy and Oncology (ESTRO).
The models’ advantage holds even after the doctor has seen the patient (a consultation that can provide extra information) and knows what the treatment plan and radiation dose will be.
“The number of treatment options available for lung cancer patients is increasing, as is the amount of information available about the individual patient. It is evident that this will complicate the task of the doctor in the future,” said the presenter, Dr Cary Oberije, a postdoctoral researcher at the MAASTRO Clinic, Maastricht University Medical Center, Maastricht, The Netherlands. “If models based on patient, tumour and treatment characteristics already out-perform the doctors, then it is unethical to make treatment decisions based solely on the doctors’ opinions. We believe models should be implemented in clinical practice to guide decisions.”
Dr Oberije and her colleagues in The Netherlands used mathematical prediction models that had already been tested and published. The models use information from previous patients to create a statistical formula that can be used to predict the probability of outcome and responses to treatment using radiotherapy with or without chemotherapy for future patients.
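The published MAASTRO models themselves are not reproduced in this article, but the general shape of such a “statistical formula” can be illustrated with a toy logistic model. The features and coefficients below are invented for illustration only; they are not the actual published model:

```python
import math

# Generic illustration of a prediction model of the kind described:
# a logistic formula turning patient/tumour/treatment characteristics
# into a probability. The coefficients are invented, NOT the published model.
def predict_two_year_survival(age, tumour_volume_cc, chemo):
    # Linear score from hypothetical coefficients.
    score = 2.0 - 0.03 * age - 0.01 * tumour_volume_cc + 0.5 * chemo
    # Logistic link maps the score onto a probability between 0 and 1.
    return 1.0 / (1.0 + math.exp(-score))

p = predict_two_year_survival(age=65, tumour_volume_cc=40, chemo=1)
print(f"predicted two-year survival probability: {p:.2f}")
```

Fitting such a formula to data from previous patients, as the researchers describe, amounts to estimating those coefficients so that the predicted probabilities match observed outcomes.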
Having obtained predictions from the mathematical models, the researchers asked experienced radiation oncologists to predict the likelihood of lung cancer patients surviving for two years, or suffering from shortness of breath (dyspnea) or difficulty swallowing (dysphagia), at two points in time:
1) after they had seen the patient for the first time, and
2) after the treatment plan was made.
At the first time point, the doctors predicted two-year survival for 121 patients, dyspnea for 139, and dysphagia for 146 patients. At the second time point, predictions were available for only 35, 39, and 41 patients respectively.
For all three predictions and at both time points, the mathematical models substantially outperformed the doctors’ predictions, with the doctors’ predictions being little better than those expected by chance.
The researchers plotted the results on a graph known as an ROC curve, on which the area under the plotted line measures the accuracy of predictions: 1 represents a perfect prediction, while 0.5 represents predictions that were right in only 50% of cases, i.e. no better than chance. They found that the model predictions at the first time point scored 0.71 for two-year survival, 0.76 for dyspnea, and 0.72 for dysphagia. In contrast, the doctors’ predictions scored 0.56, 0.59, and 0.52 respectively.
The models had a better positive predictive value (PPV), the proportion of patients correctly assessed as being at risk of dying within two years or of suffering from dyspnea or dysphagia, than the doctors. The negative predictive value (NPV), the proportion of patients correctly assessed as not being at risk, was comparable between the models and the doctors.
“This indicates that the models were better at identifying high risk patients that have a very low chance of surviving or a very high chance of developing severe dyspnea or dysphagia,” said Dr Oberije.
The researchers say it is important that further research is carried out into how prediction models can be integrated into standard clinical care. In addition, the models should be improved by incorporating the latest advances in areas such as genetics and imaging. This will make it possible to tailor treatment to the individual patient’s biological make-up and tumour type.
“In our opinion, individualised treatment can only succeed if prediction models are used in clinical practice. We have shown that current models already outperform doctors. Therefore, this study can be used as a strong argument in favour of using prediction models and changing current clinical practice,” said Dr Oberije.
“Correct prediction of outcomes is important for several reasons,” she continued. “First, it offers the possibility to discuss treatment options with patients. If survival chances are very low, some patients might opt for a less aggressive treatment with fewer side-effects and better quality of life. Second, it could be used to assess which patients are eligible for a specific clinical trial. Third, correct predictions make it possible to improve and optimise the treatment. Currently, treatment guidelines are applied to the whole lung cancer population, but we know that some patients are cured while others are not and some patients suffer from severe side-effects while others don’t. We know that there are many factors that play a role in the prognosis of patients and prediction models can combine them all.”
At present, prediction models are not used as widely as they could be by doctors. Dr Oberije says there are a number of reasons: some models lack clinical credibility; others have not yet been tested; the models need to be available and easy to use by doctors; and many doctors still think that seeing a patient gives them information that cannot be captured in a model. “Our study shows that it is very unlikely that a doctor can outperform a model,” she concluded.
President of ESTRO, Professor Vincenzo Valentini, a radiation oncologist at the Policlinico Universitario A. Gemelli, Rome, Italy, commented: “The booming growth of biological, imaging and clinical information will challenge the decision capacity of every oncologist. The understanding of the knowledge management sciences is becoming a priority for radiation oncologists in order for them to tailor their choices to cure and care for individual patients.”
 For the mathematicians among you, the graph is known as an Area Under the Curve (AUC) of the Receiver Operating Characteristic (ROC).
 This work was partially funded by grants from the Dutch Cancer Society (KWF), the European Fund for Regional Development (INTERREG/EFRO), and the Center for Translational Molecular Medicine (CTMM).
Apr. 15, 2013 — Mathematical estimates of treatment outcomes can cut costs and provide faster delivery of preventative measures.
South Africa is home to the largest HIV epidemic in the world with a total of 5.6 million people living with HIV. Large-scale clinical trials evaluating combination methods of prevention and treatment are often prohibitively expensive and take years to complete. In the absence of such trials, mathematical models can help assess the effectiveness of different HIV intervention combinations, as demonstrated in a new study by Elisa Long and Robert Stavert from Yale University in the US. Their findings appear in the Journal of General Internal Medicine, published by Springer.
Currently 60 percent of individuals in need of treatment for HIV in South Africa do not receive it. The allocation of scant resources to fight the HIV epidemic means each strategy must be measured in terms of cost versus benefit. A number of new clinical trials have presented evidence supporting a range of biomedical interventions that reduce transmission of HIV. These include voluntary male circumcision — now recommended by the World Health Organization and Joint United Nations Programme on HIV/AIDS as a preventive strategy — as well as vaginal microbicides and oral pre-exposure prophylaxis, all of which confer only partial protection against HIV. Long and Stavert show that a combination portfolio of multiple interventions could not only prevent up to two-thirds of future HIV infections, but is also cost-effective in a resource-limited setting such as South Africa.
The authors developed a mathematical model accounting for disease progression, mortality, morbidity and the heterosexual transmission of HIV to help forecast future trends in the disease. Using data specific for South Africa, the authors estimated the health benefits and cost-effectiveness of a “combination approach” using all three of the above methods in tandem with current levels of antiretroviral therapy, screening and counseling.
For each intervention, they calculated the HIV incidence and prevalence over 10 years. At present rates of screening and treatment, the researchers predict that HIV prevalence will decline from 19 percent to 14 percent of the population in the next 10 years. However, they calculate that their combination approach including male circumcision, vaginal microbicides and oral pre-exposure prophylaxis could further reduce HIV prevalence to 10 percent over that time scale — preventing 1.5 million HIV infections over 10 years — even if screening and antiretroviral therapy are kept at current levels. Increasing antiretroviral therapy use and HIV screening frequency in addition could avert more than 2 million HIV infections over 10 years, or 60 percent of the projected total.
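Projections of this kind typically come from compartmental epidemic models. A deliberately minimal yearly-step sketch of the idea — the parameters below are invented for illustration and are not those of Long and Stavert’s South Africa model:

```python
# A minimal compartmental sketch of the kind of projection described
# above. All numbers are illustrative, not the study's parameters.

def project_prevalence(prev0, beta, mortality, years, efficacy=0.0):
    """Project prevalence with a yearly-step susceptible-infected model.

    prev0     -- initial prevalence (fraction of population infected)
    beta      -- baseline yearly transmission rate per infected contact
    mortality -- yearly excess death rate among the infected
    efficacy  -- fraction by which interventions cut transmission
    """
    prev = prev0
    for _ in range(years):
        new_infections = beta * (1 - efficacy) * prev * (1 - prev)
        prev = prev + new_infections - mortality * prev
    return prev

baseline = project_prevalence(0.19, beta=0.09, mortality=0.08, years=10)
combo = project_prevalence(0.19, beta=0.09, mortality=0.08, years=10,
                           efficacy=0.6)
print(round(baseline, 3))  # declines slowly with no added interventions
print(round(combo, 3))     # declines faster under combined prevention
```

Real models like the one in the study add far more structure (age, sex, treatment stages, costs), but the core logic — interventions scale down transmission, and the model integrates the consequences over a decade — is the same.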
The researchers also determined a hierarchy of effectiveness versus cost for these intervention strategies. Where budgets are limited, they suggest money should be allocated first to increasing male circumcision, then to more frequent HIV screening, use of vaginal microbicides and increasing antiretroviral therapy. Additionally, they calculate that omitting pre-exposure prophylaxis from their combination strategy could offer 90 percent of the benefits of treatment for less than 25 percent of the costs.
The authors conclude: “In the absence of multi-intervention randomized clinical or observational trials, a mathematical HIV epidemic model provides useful insights about the aggregate benefit of implementing a portfolio of biomedical, diagnostic and treatment programs. Allocating limited available resources for HIV control in South Africa is a key priority, and our study indicates that a multi-intervention HIV portfolio could avert nearly two-thirds of projected new HIV infections, and is a cost-effective use of resources.”
Long, E.F. and Stavert, R.R. Portfolios of biomedical HIV interventions in South Africa: a cost-effectiveness analysis. Journal of General Internal Medicine, 2013 DOI:10.1007/s11606-013-2417-1
Sep. 20, 2012 — Mathematical modeling being tested by researchers at the School of Science at Indiana University-Purdue University Indianapolis (IUPUI) and the IU School of Medicine has the potential to impact the knowledge and treatment of several diseases that continue to challenge scientists across the world.
Mathematical modeling allows researchers to closely mirror patient data, which is helpful in determining the cause and effect of certain risk factors. (Credit: Image courtesy of Indiana University-Purdue University Indianapolis School of Science)
The National Science Foundation recently recognized the work led by Drs. Giovanna Guidoboni, associate professor of mathematics in the School of Science, and Alon Harris, professor of ophthalmology and director of clinical research at the Eugene and Marilyn Glick Eye Institute, for its new approach to understanding what actually causes debilitating diseases like glaucoma. Their research could translate to more efficient treatments for diseases like diabetes and hypertension as well.
Glaucoma is the second-leading cause of blindness in the world, yet the only primary form of treatment is to reduce pressure in the patient’s eye. However, as many as one-third of glaucoma patients have no elevated eye pressure, and the current inability to understand which risk factors led to the disease can hinder treatment options.
Mathematical modeling, which creates an abstract model using mathematical language to describe the behavior of a system, allows doctors to better measure things like blood flow and oxygen levels in fine detail in the eye, the easiest human organ to study without invasive procedures. Models also can be used to estimate what cannot be measured directly, such as the pressure in the ocular vessels.
Through simulations, the mathematical model can help doctors determine the cause and effect of reduced blood flow, cell death and ocular pressure and how those risk factors affect one another in the presence of glaucoma. A better understanding of these factors — and the ability to accurately measure their interaction — could greatly improve doctors’ ability to treat the root causes of disease, Harris said.
“This is a unique, fresh approach to research and treatment,” Harris said. “We’re talking about the ability to identify tailor-made treatments for individual patients for diseases that are multi-factorial and where it’s difficult to isolate the path and physicality of the disease.”
Harris and Guidoboni have worked together for the past 18 months on the project. Dr. Julia Arciero, assistant professor of mathematical sciences at IUPUI, is also a principal investigator on the project, with expertise in mathematical modeling of blood flow.
The preliminary findings have been published in the British Journal of Ophthalmology and the research currently is under review in the Journal of Mathematical Biosciences and Engineering and the European Journal of Ophthalmology. The NSF recognized their work on Aug. 30 with a three-year grant to continue their research.
The pair also presented their findings at the 2012 annual meeting of the Association for Research in Vision and Ophthalmology (ARVO). Harris suggested that, out of the 12,000 ARVO participants, their group might have been the only research group to include mathematicians, which speaks highly of the cross-disciplinary collaboration occurring regularly at IUPUI.
“We approached this as a pure math question, where you try to solve a certain problem with the data you have,” said Guidoboni, co-director of the School of Science Institute for Mathematical Modeling and Computational Science (iM2CS) at IUPUI, a research center dedicated to using modeling methods to solve problems in medicine, the environment and computer science.
Guidoboni has expertise in applied mathematics. She also has a background in engineering, which she said helps her to approach medical research from a tactical standpoint where the data and feedback determine the model. She previously used modeling to better understand blood flow from the heart.
Harris said the potential impact has created quite a stir in the ocular research community.
“The response among our peers has been unheard of. The scientific community has been accepting of this new method and they are embracing it,” Harris added.
The group will seek additional research funding through the National Institutes of Health, The Glaucoma Foundation and other medical entities that might benefit from the research. The initial success of their collaboration should lead to more cross-disciplinary projects in the future, Guidoboni said.
Also contributing are graduate students in mathematics, Lucia Carichino and Simone Cassani, and researchers in the department of ophthalmology, including Drs. Brent Siesky, Annahita Amireskandari and Leslie Tobe.
Reuters | By Kate Kelland | Posted: 03/10/2013 11:10 pm
LONDON, March 11 (Reuters) – Antibiotic resistance poses a catastrophic threat to medicine and could mean patients having minor surgery risk dying from infections that can no longer be treated, Britain’s top health official said on Monday.
Sally Davies, the chief medical officer for England, said global action is needed to fight antibiotic, or antimicrobial, resistance and fill a drug “discovery void” by researching and developing new medicines to treat emerging, mutating infections.
Only a handful of new antibiotics have been developed and brought to market in the past few decades, and it is a race against time to find more, as bacterial infections increasingly evolve into “superbugs” resistant to existing drugs.
“Antimicrobial resistance poses a catastrophic threat. If we don’t act now, any one of us could go into hospital in 20 years for minor surgery and die because of an ordinary infection that can’t be treated by antibiotics,” Davies told reporters as she published a report on infectious disease.
“And routine operations like hip replacements or organ transplants could be deadly because of the risk of infection.”
One of the best known superbugs, MRSA, is alone estimated to kill around 19,000 people every year in the United States – far more than HIV and AIDS – and a similar number in Europe.
And others are spreading. Cases of totally drug resistant tuberculosis have appeared in recent years and a new wave of “super superbugs” with a mutation called NDM 1, which first emerged in India, has now turned up all over the world, from Britain to New Zealand.
Last year the WHO said untreatable superbug strains of gonorrhoea were spreading across the world.
Laura Piddock, a professor of microbiology at Birmingham University and director of the campaign group Antibiotic Action, welcomed Davies’ efforts to raise awareness of the problem.
“There are an increasing number of infections for which there are virtually no therapeutic options, and we desperately need new discovery, research and development,” she said.
Davies called on governments and organisations across the world, including the World Health Organisation and the G8, to take the threat seriously and work to encourage more innovation and investment into the development of antibiotics.
“Over the past two decades there has been a discovery void around antibiotics, meaning diseases have evolved faster than the drugs to treat them,” she said.
Davies called for more cooperation between the healthcare and pharmaceutical industries to preserve the existing arsenal of antibiotics, and more focus on developing new ones.
Increasing surveillance to keep track of drug-resistant superbugs, prescribing fewer antibiotics and making sure they are only prescribed when needed, and ensuring better hygiene to keep infections to a minimum were equally important, she said.
Nigel Brown, president of the Society for General Microbiology, agreed the issues demanded urgent action and said its members would work hard to better understand infectious diseases, reduce transmission of antibiotic resistance, and help develop new antibiotics.
“The techniques of microbiology and new developments such as synthetic biology will be crucial in achieving this,” he said. (Editing by Jason Webb)
[Curious that so much apprehension exists about geoengineering, and so little is directed at this kind of zoo-engineering.]
Research into the intestinal function of insects is expanding knowledge of these animals’ physiology and could help create innovative methods to fight diseases and control crop pests (3D structure of cathepsin L2)
By Fábio Reynol
Agência FAPESP – Several human illnesses, such as dengue, Chagas disease and leishmaniasis, as well as pests that destroy cotton, sugarcane and banana crops, have one thing in common: they are caused by insects.
An extensive research project conducted at the Institute of Chemistry (IQ) of the Universidade de São Paulo (USP) has broadened knowledge about different insects through a distinctive approach: the investigation of intestinal function. In doing so, it has opened the way for innovative control methods.
The project, coordinated by Walter Ribeiro Terra, full professor at IQ-USP – with professor Clelia Ferreira as principal investigator and vice-coordinator – is a continuation of Thematic Projects on the same topic under way since 1991. The new project began in 2012 and is scheduled to conclude in 2017.
Among the main findings of the project concluded this year is that blood-feeding mosquitoes of the order Diptera share special trypsins that are essential for protein digestion. “This information makes this type of trypsin a possible control target for all mosquitoes in this group,” said Terra.
It is a highly relevant target, since the order Diptera includes the genera Anopheles, Aedes and Culex, which comprise insect vectors of important diseases such as malaria, yellow fever, dengue and filariasis.
According to Terra, inhibiting trypsin could be an effective method of controlling these diseases, since it would block the insects’ digestive process. To that end, the work also involved the search for chemical inhibitors of the enzymes identified.
The method used was computational modeling based on three-dimensional images of these molecules. In a 3D digital model of the enzyme to be inhibited, inhibitor molecules are virtually tested to see how many of its pockets, or functional sites, they can fit into.
“The more functional sites the reagent docks into, the stronger the binding and the more efficient the inhibitor,” Terra told Agência FAPESP, explaining that 3D molecular modeling is widely used in the pharmaceutical industry.
A blocked enzyme cannot recombine and perform its function in the digestive process: breaking down other molecules. Unable to absorb the nutrients they need, the mosquitoes die.
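The scoring idea described above — rank candidate inhibitors by how many of the enzyme’s functional sites they engage — can be rendered as a toy sketch. The site names and candidate molecules below are invented for illustration; real docking software scores geometric and chemical fit, not just set overlap:

```python
# Toy rendering of the docking-score idea above: rank candidate
# inhibitors by how many of the enzyme's functional sites they engage.
# Site names and candidates are invented for illustration.

enzyme_sites = {"S1", "S2", "S3", "S4"}

candidates = {
    "inhibitor_A": {"S1", "S3"},
    "inhibitor_B": {"S1", "S2", "S3"},
    "inhibitor_C": {"S2"},
}

def docking_score(sites_engaged, enzyme_sites):
    """Score = number of enzyme functional sites the candidate engages."""
    return len(sites_engaged & enzyme_sites)

# Best candidate first: the one engaging the most functional sites.
ranked = sorted(candidates,
                key=lambda c: docking_score(candidates[c], enzyme_sites),
                reverse=True)
print(ranked)  # ['inhibitor_B', 'inhibitor_A', 'inhibitor_C']
```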
Studying the physiology of the kissing bug Rhodnius prolixus, vector of Chagas disease, has always been difficult, and observing its intestinal function has been an obstacle for researchers.
Terra’s team got around the problem by finding a similar insect, Dysdercus peruvianus, a bug that attacks cotton. Transcriptomes (the parts of the genome that encode proteins) of this insect revealed details that may also hold for the kissing bug, potentially yielding control targets in that insect.
The sugarcane agribusiness may also benefit from the study. Cathepsin L, a digestive enzyme typical of many beetles, was isolated from Sphenophorus levis, a beetle whose larval stage attacks the root system of sugarcane. The enzyme was cloned, expressed and characterized with synthetic substrates and inhibitors. The same enzyme found in Tenebrio molitor, a beetle known as the mealworm, had its three-dimensional structure solved.
“The biggest challenge in determining the three-dimensional structure is crystallizing the protein, because if it does not crystallize we cannot obtain the model,” said Terra, noting that many proteins fail to form crystals, making their three-dimensional visualization unfeasible.
Structure of development
One particular structure of the insect intestinal system received special attention in the Thematic Project conducted at IQ-USP: the peritrophic membrane.
Shaped like a tiny tube, its role is known to be linked to digestive efficiency, but its functions are still not fully understood by science. Some of these hypothetical functions were tested in model insects, and the membrane was found to play a predominant role in insect development.
Insects whose peritrophic membranes were inhibited had their development impaired. At the same time, some plants possess natural reagents that attack this membrane, which protects them from being devoured by insects. “This information makes this structure an important target for innovative control processes,” Terra observed.
The Thematic Project also led to considerable advances in the understanding of species evolution. Besides being a possible control target for houseflies, the enzyme cathepsin D is also present in humans and in other animals with highly acidic digestive systems geared toward processing bacteria-rich food.
“The interesting thing about this discovery was finding that the same evolutionary adaptation occurred twice, independently, in the fly and in the human species,” said Terra.
Another important advance concerned insect morphophysiology. A study of the bug Podisus nigrispinus, a predator of other insects, showed that what had been called extra-oral digestion in that insect is in fact a dispersion of the prey’s tissues by the action of a salivary substance. Digestion proper takes place inside the insect’s gut.
The finding, published in the Journal of Insect Physiology, drew a special mention from one of the journal’s reviewers. “He wrote that, based on this work, the concepts of digestion outside the body should be rethought,” said Terra, stressing that the team received this recognition with great pride.
The project also identified lysozyme as a critical enzyme in the digestion of flies that attack fruit, showed that trehalase is crucial for caterpillar crop pests, and found that beta-glucanases, absent in mammals, are involved in insect digestion and immunity. All of them are potential control targets for the insects involved.
More than 1,300 citations
The results of the four years of study are recorded in 20 publications and four book chapters, and the project’s laboratory work was cited 1,357 times in the world scientific literature during this period.
Within the scope of the Thematic Project, three master’s dissertations, six doctoral theses and two postdoctoral studies were produced. The project was supported by five FAPESP undergraduate research scholarships, one doctoral scholarship and the two postdoctoral scholarships.
The Thematic Project also fostered partnerships with several Brazilian institutions, such as the Universidade Federal de Santa Catarina (UFSC), the Universidade Federal de Lavras (UFL), the Universidade Federal de São Carlos (UFSCar), the Instituto Nacional de Ciência e Tecnologia (INCT) de Entomologia Molecular, of which IQ-USP is part, and the Escola Superior de Agricultura Luiz de Queiroz (Esalq), also part of USP.
The group also participates in an international consortium to sequence the genome of the kissing bug Rhodnius prolixus; the results are still under analysis and, according to Terra, should yield several practical applications.
Here’s something you probably learned once in a biology class, more or less. There’s this molecule called DNA. It contains a long code that created you and is unique to you. And faithful copies of the code live inside the nucleus of every one of the trillions of cells in your body.
In a later class you may have learned a few exceptions to that “faithful copies” bit. Sometimes, especially during development, when cells are dividing into more cells, a mutation pops up in the DNA of a daughter cell. This makes the daughter cell and all of its progeny genetically distinct. The phenomenon is called ‘somatic mosaicism’, and it tends to happen in sperm cells, egg cells, immune cells, and cancer cells. But it’s pretty infrequent and, for most healthy people, inconsequential.
That’s what the textbooks say, anyway, and it’s also a common assumption in medical research. For instance, genetic studies of living people almost always collect DNA from blood draws or cheek swabs, even if investigating the tangled roots of, say, heart disease or diabetes or autism. The assumption is that whatever genetic blips show up in blood or saliva will recapitulate what’s in the (far less accessible) cells of the heart, pancreas, or brain.
Two recent reports suggest that somatic mosaicism is far more common than anybody ever realized — and that might be a good thing.
Colored bars show the locations of genetic glitches in tissues from each of the six subjects (inner vertical numbers). The numbers on the outer edge of the circle correspond to each of our 23 chromosomes, and each color represents a different organ. Image courtesy of PNAS
In the first study Michael Snyder and colleagues looked at cells in 11 different organs and tissues obtained from routine autopsies of six unrelated people who had not died of cancer or any hereditary disease.
Then the scientists screened each tissue for small deletions or duplications of DNA, called copy number variations, or CNVs. These are fairly common in all of us.
In order to do genetic screens, researchers have to mash up a bunch of cells and pull DNA out of the aggregate. That makes research on somatic mutations tricky, because you can’t tell how some cells in the tissue might be different from others. The researchers got around that problem by doing side-by-side comparisons of the tissues from each person. If one tissue has a CNV and the other one doesn’t, they reasoned, then it must be a somatic glitch.
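The side-by-side reasoning described above can be sketched as a simple set comparison: a CNV present in one tissue but absent from another tissue of the same person is flagged as a candidate somatic event. The CNV labels below are made up for illustration:

```python
# Toy version of the side-by-side comparison described above: a CNV seen
# in one tissue but not another from the same person is flagged as a
# candidate somatic event. CNV labels are invented for illustration.

def somatic_cnvs(tissue_a, tissue_b):
    """Return the CNVs private to each tissue (candidate somatic events)."""
    only_a = set(tissue_a) - set(tissue_b)
    only_b = set(tissue_b) - set(tissue_a)
    return only_a, only_b

liver = {"chr1:del:1.2Mb", "chr7:dup:0.4Mb", "chr12:del:0.8Mb"}
brain = {"chr1:del:1.2Mb", "chr7:dup:0.4Mb", "chr5:dup:0.3Mb"}

liver_only, brain_only = somatic_cnvs(liver, brain)
print(liver_only)  # {'chr12:del:0.8Mb'} -> somatic candidate in liver
print(brain_only)  # {'chr5:dup:0.3Mb'}  -> somatic candidate in brain
```

Shared CNVs (present in both tissues) drop out of the comparison, which is the point: they were most likely inherited, not somatic.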
As they reported in October in the Proceedings of the National Academy of Sciences, Snyder’s team found a total of 73 somatic CNVs in the six people, cropping up in tissues all over the body, including the brain, liver, pancreas and small intestine. “Your genome is not static — it does change through development,” says Snyder, chair of the genetics department at Stanford. “People knew that, but it had never been systematically studied.”
OK, but do somatic mutations do anything? It’s hard to tell, particularly because postmortem studies offer no living person to observe. Still, the scientists showed that 79 percent of the somatic mutations fell inside of genes, and most of those genes play a role in the cell’s everyday regulatory processes, like metabolism, phosphorylation, and turning genes on. So the somatic mutations could very well have had an impact.
In the last paragraph of their paper the researchers mention that the findings could also have big implications for studies of induced pluripotent stem (iPS) cells. This line of research is getting increasingly popular, for good reason. With iPS technology, researchers start with a small piece of skin (or…) from a living person. They then expose those skin cells to a certain chemical concoction that reprograms them back into a primordial state. Once the stem cells are created, researchers can put them in yet another chemical soup that coaxes them to differentiate into whatever type of cell the scientists want to study. You can see why it’s cool: The technique allows scientists to create cells — each holding an individual’s unique DNA code, remember — in a Petri dish. Researchers can study neurons of children with autism, for example, without ever touching their brains.
Trouble is, several groups have reported that iPS cells carry mutations that the original skin cells don’t have. This suggests that something screwy is happening during the reprogramming process, defeating the whole purpose of making the cells. (Fellow Phenomena contributor Ed Yong wrote a fantastic post about the hoopla last year.)
But that last paragraph of Snyder’s study offers a bit of hope. What if the mutations that crop up in iPS cells actually were in the skin cells they came from, but just didn’t get picked up because those skin cells were mixed with other skin cells that didn’t have the mutations? In other words, what if skin cells, like all those other tissues they looked at in the paper, are mosaics?
Flora Vaccarino‘s team at Yale sequenced the entire genome of 21 iPS cell lines, three each from seven people, as well as the skin cells that the iPS cells originated from. It turns out that each iPS line has an average of two CNVs and that at least half of these come from somatic mutations in the skin cells. (The researchers used special techniques for amplifying the DNA of the skin cells, so that they could detect CNVs that are present only in a fraction of the cells.)
That means two things. First, researchers using iPS cells can exhale. Their freaky reprogramming process doesn’t seem to create too much genetic havoc in the iPS cells. And second, somatic mosaicism happens a lot. Vaccarino’s study estimates that a full 30 percent of the skin cells carry somatic mutations.
Our widespread mosaicism may have implications for certain diseases. Somatic mutations have been strongly linked to tumors, for example, so it could be that people who have a lot of mosaicism are at a higher risk of cancer. But there’s also a positive way to spin it. Somatic mutations give our genomes an extra layer of flexibility, in a sense, that can come in handy. Snyder gives a good example in his study. If you have a group of cells that are constantly exposed to viruses, say, then it might be beneficial to have a somatic mutation pop up that damages receptors on the cell that viruses can latch on to.
But there’s likely a more parsimonious explanation for all of those genetic copying mistakes. “When you’re replicating DNA, there’s a certain expense to keep everything perfect,” Snyder says, meaning that it would cost the cell a lot of energy to ensure that every new cell was identical to the last. And in the end, he adds, that extra expense may not be worth it. “Having imperfections could just be an economically beneficial way for organisms to do things.”
Models with body mass index below 18.5 may not be shown in Israeli media, on websites or go down catwalk at fashion shows.
Starting on Tuesday, female and male models who have a body mass index (BMI) of less than 18.5 may not be shown in the media or on Israeli websites or go down the catwalk at fashion shows.
The law, initiated by then-Kadima MK Rachel Adatto, aims to protect impressionable teens from eating disorders.
Every year, an average of 30 young adults and teens die of anorexia or bulimia.
The law, also sponsored by Likud-Beytenu MK Danny Danon and believed to be the first of its kind in the world, does make violations a criminal offense bearing a fine. But violators can be sued in court by interested citizens, including families whose relatives have suffered or died due to eating disorders encouraged by images of overly thin models.
While the media that publish or present illegal images are not liable, they will get a bad image for doing so; the company that produced the ad, ran the fashion show or used the overly skinny presenter can be taken to court.
In addition, any advertisement in which Photoshop or other graphics programs were used to make the model look as if he or she has a BMI under 18.5 must carry a warning that the image was digitally altered. The warning must be clear and prominent, covering at least 7 percent of the ad space.
BMI is defined as an individual’s weight in kilograms divided by the square of his or her height in meters. Would-be models in campaigns and fashion shows must first obtain and present a written statement from their physician, issued no more than three months earlier, confirming that their BMI was above 18.5.
Without it, they cannot appear.
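The definition above is a one-line formula. As a quick sketch (the 18.5 cutoff is the legal threshold described in the article; the example weights and heights are invented):

```python
# The BMI definition from the article: weight in kilograms divided by
# the square of height in meters. The 18.5 cutoff is the legal
# threshold described above; the example figures are invented.

def bmi(weight_kg, height_m):
    """Body mass index: kg / m^2."""
    return weight_kg / height_m ** 2

def may_appear(weight_kg, height_m, cutoff=18.5):
    """True if the model's BMI is above the legal cutoff."""
    return bmi(weight_kg, height_m) > cutoff

print(round(bmi(50, 1.75), 1))  # 16.3 -> under the cutoff
print(may_appear(50, 1.75))     # False
print(may_appear(62, 1.75))     # True (BMI is about 20.2)
```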
Adatto, a gynecologist by profession who is not likely to return to the Knesset because she joined The Tzipi Livni Party and was placed low on its list, said that on January 1, a “revolution against the anorexic model of beauty begins. Overly skinny models who look as if they eat a biscuit a day and then serve as a model for our children” will no longer be visible.
Every year, some 1,500 teenagers develop an eating disorder, and 5% of those suffering from anorexia die each year. The problem even affects the ultra-Orthodox community, because some haredi men increasingly demand very thin brides.
Adi Barkan, a veteran fashion photographer and model agent who “repented” and is in the Israel Center for the Change in Eating Habits and a prime advocate for Adatto’s bill, said: “We are all affected. We wear black, do [drastic] diets and are obsessive about our looks. The time has come for the end of the era of skeletons on billboards and sickly thinness all over. The time has come to think about ourselves and our children and take responsibility for what we show them. Too thin is not sexy.”
The Second Authority for Television and Radio, which regulates commercially operated television and radio broadcasts, has already issued instructions to its employees to observe the new law.
By Felipe Frazão | Estadão Conteúdo – 11 hours ago (Yahoo Notícias)
Brazil’s Health Ministry will make notification of all people infected with HIV compulsory, including those who have not developed the disease. The ministerial decree mandating the reporting of every case of detection of the AIDS virus in the country is expected to be published in January.
Currently, doctors and laboratories report to the Health Ministry only cases of patients who carry HIV and have actually developed the disease. The data will be kept confidential; only profile information (without names) may be released for statistical purposes.
Today the government monitors HIV-positive people without AIDS only indirectly. The available information covers people who have had their immune-cell counts taken at public health services or are registered to receive antiretrovirals through the Unified Health System (SUS). The new database will be used to plan public policies for AIDS prevention and treatment.
“For public health this is extremely important, because we will really be able to know how many people are infected and what kind of services we are going to need,” explains Dirceu Grego, director of the Health Ministry’s Department of STDs, AIDS and Viral Hepatitis.
The change comes four months after the government announced expanded access to antiretroviral treatment through the SUS, with prescriptions now beginning at less advanced stages of AIDS.
Since then, couples in which one partner is HIV-positive have had access to therapy at any stage of the disease.
The ministry has also recommended that the drugs be administered earlier to people who carry the virus but have no AIDS symptoms – a trend in managing the disease that was reinforced at the latest International AIDS Conference, held in the United States in July of this year.
At the time, the ministry estimated that the number of Brazilians with HIV taking antiretrovirals would grow by 35,000. There are currently about 220,000 patients with AIDS.
The government estimates that another 135,000 people have HIV but do not know it. They are the focus of the change in mandatory notification, because they have not yet been diagnosed. According to Grego, these people should be brought into treatment. Just as happens when patients are diagnosed with AIDS, doctors and laboratories will be responsible for notifying the ministry when they identify infected people – the HIV-positive. The information is from the newspaper O Estado de S.Paulo.
Dec. 17, 2012 — Physicians should not prescribe cognitive enhancers to healthy individuals, states a report being published today in the Canadian Medical Association Journal (CMAJ). Dr. Eric Racine and his research team at the IRCM, the study’s authors, provide their recommendation based on the professional integrity of physicians, the drugs’ uncertain benefits and harms, and limited health care resources.
Prescription stimulants and other neuropharmaceuticals, generally prescribed to treat attention deficit disorder (ADD), are often used by healthy people to enhance concentration, memory, alertness and mood, a phenomenon described as cognitive enhancement.
“Individuals take prescription stimulants to perform better in school or at work,” says Dr. Racine, a Montréal neuroethics specialist and Director of the Neuroethics research unit at the IRCM. “However, because these drugs are available in Canada by prescription only, people must request them from their doctors. Physicians are thus important stakeholders in this debate, given the risks and regulations of prescription drugs and the potential for requests from patients for such cognitive enhancers.”
The prevalence of cognitive enhancers used by students on university campuses ranges from 1 per cent to 11 per cent. Taking such stimulants is associated with risks of dependence, cardiovascular problems, and psychosis.
“Current evidence has not shown that the desired benefits of enhanced mental performance are achieved with these substances,” explains Cynthia Forlini, first author of the study and doctoral student in Dr. Racine’s research unit. “With uncertain benefits and clear harms, it is difficult to support the notion that physicians should prescribe a medication to a healthy individual for enhancement purposes.”
“Physicians in Canada provide prescriptions through a publicly-funded health care system with expanding demands for care,” adds Ms. Forlini. “Prescribing cognitive enhancers may therefore not be an appropriate use of resources. The concern is that those who need the medication for health reasons but cannot afford it will be at a disadvantage.”
“An international bioethics discussion has surfaced on the ethics of cognitive enhancement and the role of physicians in prescribing stimulants to healthy people,” concludes Dr. Racine. “We hope that our analysis prompts reflection in the Canadian medical community about these cognitive enhancers.”
Éric Racine’s research is funded through a New Investigator Award from the Canadian Institutes for Health Research (CIHR). The report’s co-author is Dr. Serge Gauthier from the McGill Centre for Studies in Aging.
Cynthia Forlini, Serge Gauthier, and Eric Racine. Should physicians prescribe cognitive enhancers to healthy individuals? Canadian Medical Association Journal, 2012; DOI: 10.1503/cmaj.121508
Dec. 14, 2012 — Higher rates of schizophrenia in urban areas can be attributed to increased deprivation, increased population density and an increase in inequality within a neighbourhood, new research reveals. The research, led by the University of Cambridge in collaboration with Queen Mary University of London, was published today in the journal Schizophrenia Bulletin.
Dr James Kirkbride, lead author of the study from the University of Cambridge, said: “Although we already know that schizophrenia tends to be elevated in more urban communities, it was unclear why. Our research suggests that more densely populated, more deprived and less equal communities experience higher rates of schizophrenia and other similar disorders. This is important because other research has shown that many health and social outcomes also tend to be optimal when societies are more equal.”
The scientists used data from a large population-based incidence study (the East London first-episode psychosis study directed by Professor Jeremy Coid at the East London NHS Foundation Trust and Queen Mary, University of London) conducted in three neighbouring inner city, ethnically diverse boroughs in East London: City & Hackney, Newham, and Tower Hamlets.
427 people aged 18-64 years old were included in the study, all of whom experienced a first episode of psychotic disorder in East London between 1996 and 2000. The researchers assessed their social environment through measures of the neighbourhood in which they lived at the time they first presented to mental health services because of a psychotic disorder. Using the 2001 census, they estimated the population aged 18-64 years old in each neighbourhood, and then compared the incidence rate between neighbourhoods.
The incidence of schizophrenia (and other similar disorders where hallucinations and delusions are the dominant feature) still showed variation between neighbourhoods after taking into account age, sex, ethnicity and social class. Three environmental factors predicted the risk of schizophrenia: increased deprivation (which includes employment, income, education and crime), increased population density, and increased inequality (the gap between the rich and poor).
Results from the study suggested that a percentage point increase in either neighbourhood inequality or deprivation was associated with an increase in the incidence of schizophrenia and other similar disorders of around 4%.
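The reported association can be illustrated numerically. The sketch below assumes a simple multiplicative model (an assumption for illustration, not the authors’ statistical specification), applying roughly 4% per percentage point to a hypothetical baseline rate:

```python
def projected_incidence(baseline_rate: float, extra_points: float,
                        effect_per_point: float = 0.04) -> float:
    """Scale a baseline incidence rate by ~4% for each additional
    percentage point of neighbourhood deprivation or inequality
    (illustrative multiplicative model)."""
    return baseline_rate * (1 + effect_per_point) ** extra_points
```

On these assumptions, a hypothetical baseline of 20 cases per 100,000 person-years would rise to about 20.8 with one extra percentage point, and about 24.3 with five.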
Dr Kirkbride added: “Our research adds to a wider and growing body of evidence that inequality seems to be important in affecting many health outcomes, now possibly including serious mental illness. Our data seems to suggest that both absolute and relative levels of deprivation predict the incidence of schizophrenia.
“East London has changed substantially over recent years, not least because of the Olympic regeneration. It would be interesting to repeat this work in the region to see if the same patterns were found.”
The study also found that risk of schizophrenia in some migrant groups might depend on the ethnic composition of their neighbourhood. For black African people, the study found that rates tended to be lower in neighbourhoods where there were a greater proportion of other people of the same background. By contrast, rates of schizophrenia were lower for the black Caribbean group when they lived in more ethnically-integrated neighbourhoods. These findings support the possibility that the socio-cultural composition of our environment could positively or negatively influence risk of schizophrenia and other similar disorders.
Dr John Williams, Head of Neuroscience and Mental Health at the Wellcome Trust said: “This research reminds us that we must understand the complex societal factors as well as the neural mechanisms that underpin the onset of mental illness, if we are to develop appropriate interventions.”
J. B. Kirkbride, P. B. Jones, S. Ullrich, J. W. Coid. Social Deprivation, Inequality, and the Neighborhood-Level Incidence of Psychotic Syndromes in East London. Schizophrenia Bulletin, 2012; DOI: 10.1093/schbul/sbs151
Black and mixed-race children in Araçuaí, Minas Gerais. Photo: Rodrigo Dai – Courtesy Ser Criança
by Fabiana Frayssinet, IPS
Rio de Janeiro, Brazil, 16/11/2012 – Faced with emergencies involving a black woman and a white woman in labor, the Brazilian doctor chooses the white one, because “black women are more resistant to pain and are used to giving birth.” Brazilian cultural and social conventions “impose stereotyped conditions on black people, which mean they do not have the same guarantees of health care as a white person,” psychologist Crisfanny Souza Soares, of the National Network for Social Control and Health of the Black Population, told IPS. These stereotypes reflect a racism that harms health, and which a campaign is trying to root out of the Brazilian hospital system.
Of Brazil’s 192 million people, half identify as black. The National Mobilization for the Health of the Black Population was launched this year by Afro-Brazilian organizations with the support of the United Nations Population Fund (UNFPA). Under the slogan “Long life, with health and without racism,” the campaign’s goal is comprehensive health at every stage of life, urging society, and the health system in particular, to fight discrimination in order to reduce the high mortality rates of the population of African descent.
“Practically all of the health indicators for black women are worse than those for white women. In a breast cancer consultation, black women are palpated less than white women, and they receive less anesthesia in childbirth,” says Crisfanny. The Health Ministry, which since 2006 has pursued a comprehensive national policy for this population group within the Unified Health System (SUS), conducts studies to detect this kind of situation.
“The idea that the black population is more resistant to pain and better able to live with disease is present throughout the health system, from nursing technicians to doctors,” said Deise Queiroz, coordinator of the Articulação de Jovens Negras in Bahia. She knows this well, especially because her mother, who suffers from diabetes and high blood pressure, must turn to the public health system frequently. According to the activist, the SUS, once a model of democratized health care, can no longer keep up with demand, and “racist attitudes become more evident.”
The Constitution establishes health as a universal right that the state has a duty to provide. The SUS stipulates that “all people have the right to humane, quality treatment without any discrimination.” Yet racism seeps in, openly or subtly. “It becomes embedded in the population’s living conditions, in the organization of health services and in policy-making,” UNFPA assistant representative in Brazil Fernanda Lopes told IPS. “That is why specific equity policies must be built,” she said.
An epidemiological study by the Health Ministry provides specific information to help fill those gaps by comparing indicators such as prenatal care by race, color and ethnicity. It also examines other aspects, such as the right of access to family planning, which is more precarious among women of African descent. Precisely this aspect is the focus of the UNFPA’s global report, presented on the 14th, titled “By Choice, Not by Chance: Family Planning, Human Rights and Development.”
For example, 19 percent of live births are to white adolescent mothers aged 15 to 19, while the incidence of teenage pregnancy is 29 percent among Afro-Brazilian girls of the same age group. Moreover, while 62 percent of mothers of white children reported having had seven or more prenatal consultations, only 37 percent of mothers of mixed-race and black newborns had that many check-ups before giving birth.
Infant mortality also shows disparities. The risk of a black or mixed-race child dying before the age of five from infectious and parasitic diseases is 60 percent higher than for a white child, and the risk of death from malnutrition is 90 percent higher. The study also found that more pregnant women of African descent than white women die of causes linked to pregnancy, such as hypertension.
“They say the black population’s worse health indicators are due to the fact that most are poor and therefore more vulnerable,” Crisfanny noted. But other, strictly racist variables cannot be denied, she warned. “If we see two young gunshot victims in a hospital, the cultural imagination more readily casts the white one as a victim, while the black one must be there because he was involved in a crime,” the psychologist said, adding that this framing sometimes “leads a professional to set priorities in care.”
Another concern involves diseases prevalent among people of African descent, such as sickle cell anemia, type 2 diabetes mellitus and hypertension, which the health system is not prepared to address in a specific way. Black women are 50 percent more likely to develop this type of diabetes, with the aggravating factor that hypertension among them is twice as common as in the general population.
The same is true of sickle cell anemia, which could be detected in newborns. According to the National Pro-Health Mobilization, some 3,500 Brazilian children are born with the disease each year, making it the most common genetic disease in the country. “The black population generally dies earlier, and its deaths from preventable causes are more frequent,” Fernanda said. A policy against discrimination in health care therefore “comes to minimize the impact of historical inequalities through affirmative-action strategies,” she added.
The UNFPA works with the government and the black movement to strengthen this policy and the professional training that must accompany it. “The challenge is to answer why, in a country where the black population represents 50.3 percent of the total, we have such a differentiated health picture” between blacks and whites, the Health Ministry acknowledges. Envolverde/IPS
Sergio Arthuro is a physician with a doctorate in psychobiology and a science communicator. Article sent to JC Email by the author.
The common image of us scientists, with Einstein as the stereotype, is that we are a bit mad. Indeed, as the journal Nature recently reported, we do not seem to enjoy good mental health, given the high incidence of depression among graduate students and postdocs.
Graduate students here means master’s and doctoral students, while postdocs are recent PhDs in further training who have not yet found a stable job. Postdocs have long been common in laboratories in Europe and the United States; in Brazil they are a recent phenomenon.
According to the article, many graduate students who develop depression were excellent undergraduates. Lauren, a chemistry doctoral student at a university in the United Kingdom, started out having trouble focusing on academic work, progressed to being afraid to present her own research, and ended up unable even to get out of bed. Fortunately, Lauren sought help and is now finishing her doctorate; her case is told on the support site Students Against Depression, whose goal is “to build awareness that depression is not a personal failing or weakness but a serious condition that requires treatment,” according to psychologist Denise Meyer, who helped develop the site.
For early-career scientists, competition in academia can lead to isolation, anxiety and insomnia, which can give rise to depression. It can be aggravated if the graduate student has problems outside the university and/or with his or her advisor. Since depression significantly impairs rational judgment, depressed people lose the ability to recognize themselves as depressed. Here, in my opinion, the advisor has a fundamental role, though in practice I have rarely seen it played: caring not only about the results of the experiments but also about the student as a person.
According to the article, the main signs of depression are: a) inability to attend classes and/or do research, b) difficulty concentrating, c) reduced motivation, d) increased irritability, e) changes in appetite, f) difficulties with social interaction, g) sleep problems, such as trouble falling asleep, insomnia or non-restorative sleep (the person sleeps a lot but wakes up tired and is sleepy during the day).
According to the article, most universities lack a service that can help graduate students. Nevertheless, alternatives have proved reasonably effective. For example, master’s and doctoral students could seek help from services offered to undergraduates, while postdocs could try services offered to faculty, the authors suggest. Most treatments require only one session, in which the students’ difficulties are discussed along with suggestions on how to better manage depression. A key concern is confidentiality, which should be broken only if the professional feels the patient is at imminent risk of harming themselves or others. According to Sharon Milgram, director of training and education at the US National Institutes of Health, “seeking help is a sign of strength, not weakness.”
I must admit the article caught my attention because I identify with the topic, both from my own experience and from that of several graduate-school colleagues who faced similar problems. I think the current graduate system has flaws that can increase cases of depression, such as those described below:
1 - The very name “Defense” in the case of the doctorate
Is there anything more aggressive? Defense presupposes attack; is that really what we want? Some will say the attacks are on the ideas, not the people. I think that happens only in an ideal world, because in practice the line between ideas and the people who had them is very thin. Worse still, in Spanish-speaking countries the examining committee is called a “tribunal.”
2 - Infrequent evaluations
In many cases, especially at the start of a project, evaluations are infrequent, so all the desperation is left for the end. In my case, the last months before the “Defense” were the worst of my life: severe insomnia, the urge to give up everything, and so on. Worse still was hearing from the people who could have helped me that this was “normal” and “part of the process”... This happened not only to me but to several graduate colleagues. I believe that doing good science, like any work, has to be pleasurable, and more frequent evaluations could prevent the stress at the end of the work.
3 - Inflexible deadlines
It is ever clearer to me that science is not linear, and predictions are usually wrong. I therefore believe there should be no fixed deadline for either a master’s or a doctorate. The graduate student should have a fellowship for five years to develop the research, preparing an annual report on activities and results. A committee would judge the report to decide whether the student deserves to continue. Since every case is different, in some cases two years would already be enough to obtain a result publishable in a reputable scientific journal; that could earn the scientist another five-year fellowship, for example, to continue the research. In other cases five years of work is not enough, whether because of the complexity of the research itself or for other reasons, such as delays in importing materials. In that case, I think the student should be allowed at least three more years to complete the research, provided the annual reports are approved and the student shows that the delay is not his or her fault.
I missed in the article a discussion of the fact that, for future scientists who do not yet have a permanent job, the lack of financial stability is also a factor that weighs on the mood of this very specific and special class of human beings.
Suggested reading:
Gewin, V. (2012) Under a cloud: Depression is rife among graduate students and postdocs. Universities are working to get them the help they need. Nature 490, 299-301
It’s nearly identical, and suggests the patterns evolved before the two species split and went their own ways
Chimpanzees at Gombe Stream National Park in Tanzania have a lot in common with humans. And they both like to eat, apparently. Photo: Ian Gilby
By Stephanie Pappas
updated 11/13/2012 3:30:35 PM ET
Humans share about 99 percent of our genomes with chimpanzees. Now, research finds we share something else: gut bacteria.
The bacterial colonies that populate the chimpanzee intestinal tract are mirror images of those found in the human gut, researchers report Tuesday in the journal Nature Communications. The findings suggest gut bacteria patterns evolved before chimps and humans split and went their evolutionarily separate ways.
In 2011, researchers learned that everyone’s gut bacteria fall into one of three different types, almost analogous to blood types. In each type, certain bacteria dominate. These types weren’t linked to any personal characteristics such as geographic area, age or gender. Researchers dubbed these distinct bacterial ecosystems “enterotypes.” (“Entero” means gut or intestine.)
“No one really knows why these three enterotypes exist,” said study researcher Andrew Moeller, a doctoral student at Yale University.
Along with his adviser Howard Ochman and their colleagues, Moeller wants to understand how these enterotypes arose. They could be distinctly human, he told LiveScience, which would suggest they arose relatively recently, perhaps in response to the development of agriculture. Or they could be ancient, shared among our closest primate relatives.
The researchers analyzed gut bacteria samples from 35 chimpanzees from Gombe Stream National Park in Tanzania. The chimpanzees were all in the subspecies Pan troglodytes schweinfurthii, the eastern chimpanzee, which arose about the same time as Homo sapiens.
The researchers found that, just like humans, chimps’ guts harbor one of three distinct types of bacterial colonies. Even more intriguingly, these enterotypes matched humans’ precisely. In type 1, for example, both humans and chimps show a predominance of Bacteroides, Faecalibacterium and Parabacteroides.
There were some differences. For example, in humans and chimps, enterotype 2 is marked by an overabundance of bacteria called Lachnospiraceae. In humans, the bacteria Prevotellae is also prevalent in type 2. In chimps, Prevotellae appears in significant numbers in all three enterotypes, perhaps because it is associated with a high-carbohydrate diet.
Other differences could help explain certain human health issues. By comparing human and chimpanzee gut bacteria, the researchers found many of the bacteria present only in humans are linked to diseases such as inflammatory bowel diseases, conditions that cause pain, diarrhea and vomiting.
Seven of the chimps in the study were tested repeatedly over eight years, and their gut microbes were found to change from type to type over that time period. No one has ever tested humans for changes over a period longer than two weeks, Moeller said, but the results suggest our enterotypes may shift over time, too.
Our shared history
The similarities between chimp and human colonies suggest enterotypes predate our species, which in turn suggests that none of the three ecosystems are better than the others, Moeller said.
“Before we found this in chimpanzees, there was a possibility that enterotypes were a product of modernization, which could mean they have some negative effects on health,” he said. “I don’t think there’s any reason to think one enterotype is going to have an effect on health that’s going to be better” than the others.
Moeller and his colleagues are now examining gorilla fecal samples to find out where they stand as slightly more distant primate relatives to humans.
“The next step is to try to find out the processes and mechanisms responsible for producing these three community states,” Moeller said, “which is kind of a lofty goal, but I think more sampling will actually reveal why these communities exist.”
In this video you can see an overview of the people and ideas that come together at “Terra Madre – world meeting of food communities” (terramadre.org), an event that every two years gathers in Italy some 7,000 people from 150 countries: small producers, farmers, artisanal fishers, cooks, researchers and Slow Food activists (slowfood.com). These are people committed to defending and promoting modes of production that respect the environment, mindful of the planet’s natural resources, the conservation of biodiversity, and social justice.
Researchers credit environmental improvements, not genetics, for the increaseBy Trevor Stokes
updated 10/15/2012 7:11:26 PM ET
Humans are living longer than ever, a life-span extension that occurred more rapidly than expected and almost solely from environmental improvements as opposed to genetics, researchers said Monday.
Four generations ago, the average Swede had the same probability of dying as a hunter-gatherer, but improvements in our living conditions through medicine, better sanitation and clean drinking water (considered “environmental” changes) decreased mortality rates to modern levels in just 100 years, researchers found.
In Japan, 72 has become the new 30, as the likelihood of a 72-year-old modern-day person dying is the same as a 30-year-old hunter-gatherer ancestor who lived 1.3 million years ago. Though the researchers didn’t specifically look at the United States, they say the trends are not country-specific and not based in genetics.
Quick jump in life span
The same progress in decreasing the average probability of dying at a given age that took hunter-gatherers 1.3 million years to achieve was made in just 30 years during the 21st century.
“I pictured a more gradual transition from a hunter-gatherer mortality profile to something like we have today, rather than this big jump, most of which occurred in the last four generations, to me that was surprise,” lead author Oskar Burger, postdoctoral fellow at the Max Planck Institute for Demographic Research in Germany, told LiveScience.
Biologists have lengthened life spans of worms, fruit flies and mice in labs by selectively breeding for old-age survivorship or tweaking their endocrine system, a network of glands that affects every cell in the body. However, the longevity gained in humans over the past four generations is even greater than can be created in labs, researchers concluded. [Extending Life: 7 Ways to Live Past 100]
Genetics vs. environment
In the new work, Burger and colleagues analyzed previously published mortality data from Sweden, France and Japan, from present-day hunter-gatherers and from wild chimpanzees, the closest living relative to humans.
Humans have lived for an estimated 8,000 generations, but only in the past four have mortalities decreased to modern-day levels. Hunter-gatherers today have average life spans on par with wild chimpanzees.
The research suggests that while genetics plays a small role in shaping human mortality, the key in driving up our collective age lies with the advent of medical technologies, improved nutrition, higher education, better housing and several other improvements to the overall standards of living.
“This recent progress has been just astronomically fast compared to what we made since the split from chimpanzees,” Burger said.
Most of the decrease in mortality comes in youth: by age 15, hunter-gatherers face more than 100 times the risk of dying that modern-day people do.
What’s next?
“In terms of what’s going on in the next four generations, I want to be very clear that I don’t make any forecasts,” Burger said. “We’re in a period of transition and we don’t know what the new stable point will be.”
However, some researchers say that humans may have maxed out their old age.
“These mortality curves (that show the probability of dying by a certain age), they are now currently at their lowest possible value, which makes a very strong prediction that life span cannot increase much more,” Caleb Finch, a neurogerontology professor at the University of Southern California who studies the biological mechanisms of aging, told LiveScience in an email.
Further, Finch, who was not involved in the current study, argues that environmental degradation, including climate change and ozone pollution, combined with increased obesity “are working to throw us back to an earlier phase of our improvements, they’re regressive.”
“It’s impossible to make any reasonable predictions, but you can look, for example, in local environments in Los Angeles where the density of particles in the air predict the rate of heart disease and cancer,” Finch said, illustrating the link between the environment and health.
The study was detailed Monday in the journal Proceedings of the National Academy of Sciences.
SAO PAULO — Brazil is using an indigenous language for the first time in a campaign aimed at curbing violence against women and the spread of HIV.
The program includes pamphlets warning that “violence or fear of violence increase women’s vulnerability to HIV infection and other sexually transmitted diseases” because women who fear violence can be forced to have unprotected sex.
To get the message across to indigenous populations, leaflets and pamphlets were prepared in Tikuna, which is spoken by more than 30,000 Indians in the western tip of Amazonas state. Educational material is being prepared in other indigenous languages as well.
The campaign is a joint effort between Brazil and three United Nations agencies including the Joint United Nations Program on HIV and AIDS (UNAIDS).
“Indigenous groups have the right to this information in their own language,” said Pedro Chequer, the UNAIDS director in Brazil.
The campaign using materials in Tikuna was launched after health workers tested about 20,000 Indians for sexually transmitted diseases and found 46 with syphilis and 16 with the virus that causes AIDS, said Dr. Adele Benzaken of the UNAIDS office in Brazil.
The Tikuna Indians live near Brazil’s borders with Peru and Colombia, where prostitution and drug trafficking are rife, Benzaken said by telephone.
She said the information regarding HIV among indigenous groups will create a baseline that can be referred to in future years to determine if the incidence of the disease is increasing in that population.
Among the many mysteries of human biology is why complex diseases like diabetes, high blood pressure and psychiatric disorders are so difficult to predict and, often, to treat. An equally perplexing puzzle is why one individual gets a disease like cancer or depression, while an identical twin remains perfectly healthy.
Now scientists have discovered a vital clue to unraveling these riddles. The human genome is packed with at least four million gene switches that reside in bits of DNA that once were dismissed as “junk” but that turn out to play critical roles in controlling how cells, organs and other tissues behave. The discovery, considered a major medical and scientific breakthrough, has enormous implications for human health because many complex diseases appear to be caused by tiny changes in hundreds of gene switches.
The findings, which are the fruit of an immense federal project involving 440 scientists from 32 laboratories around the world, will have immediate applications for understanding how alterations in the non-gene parts of DNA contribute to human diseases, which may in turn lead to new drugs. They can also help explain how the environment can affect disease risk. In the case of identical twins, small changes in environmental exposure can slightly alter gene switches, with the result that one twin gets a disease and the other does not.
As scientists delved into the “junk” — parts of the DNA that are not actual genes containing instructions for proteins — they discovered a complex system that controls genes. At least 80 percent of this DNA is active and needed. The result of the work is an annotated road map of much of this DNA, noting what it is doing and how. It includes the system of switches that, acting like dimmer switches for lights, control which genes are used in a cell and when they are used, and determine, for instance, whether a cell becomes a liver cell or a neuron.
“It’s Google Maps,” said Eric Lander, president of the Broad Institute, a joint research endeavor of Harvard and the Massachusetts Institute of Technology. In contrast, the project’s predecessor, the Human Genome Project, which determined the entire sequence of human DNA, “was like getting a picture of Earth from space,” he said. “It doesn’t tell you where the roads are, it doesn’t tell you what traffic is like at what time of the day, it doesn’t tell you where the good restaurants are, or the hospitals or the cities or the rivers.”
The new result “is a stunning resource,” said Dr. Lander, who was not involved in the research that produced it but was a leader in the Human Genome Project. “My head explodes at the amount of data.”
The discoveries were published on Wednesday in six papers in the journal Nature and in 24 papers in Genome Research and Genome Biology. In addition, The Journal of Biological Chemistry is publishing six review articles, and Science is publishing yet another article.
Human DNA is “a lot more active than we expected, and there are a lot more things happening than we expected,” said Ewan Birney of the European Molecular Biology Laboratory-European Bioinformatics Institute, a lead researcher on the project.
In one of the Nature papers, researchers link the gene switches to a range of human diseases — multiple sclerosis, lupus, rheumatoid arthritis, Crohn’s disease, celiac disease — and even to traits like height. In large studies over the past decade, scientists found that minor changes in human DNA sequences increase the risk that a person will get those diseases. But those changes were in the junk, now often referred to as the dark matter — they were not changes in genes — and their significance was not clear. The new analysis reveals that a great many of those changes alter gene switches and are highly significant.
“Most of the changes that affect disease don’t lie in the genes themselves; they lie in the switches,” said Michael Snyder, a Stanford University researcher for the project, called Encode, for Encyclopedia of DNA Elements.
And that, said Dr. Bradley Bernstein, an Encode researcher at Massachusetts General Hospital, “is a really big deal.” He added, “I don’t think anyone predicted that would be the case.”
The discoveries also can reveal which genetic changes are important in cancer, and why. As they began determining the DNA sequences of cancer cells, researchers realized that most of the thousands of DNA changes in cancer cells were not in genes; they were in the dark matter. The challenge is to figure out which of those changes are driving the cancer’s growth.
“These papers are very significant,” said Dr. Mark A. Rubin, a prostate cancer genomics researcher at Weill Cornell Medical College. Dr. Rubin, who was not part of the Encode project, added, “They will definitely have an impact on our medical research on cancer.”
In prostate cancer, for example, his group found mutations in important genes that are not readily attacked by drugs. But Encode, by showing which regions of the dark matter control those genes, gives another way to attack them: target those controlling switches.
Dr. Rubin, who also used the Google Maps analogy, explained: “Now you can follow the roads and see the traffic circulation. That’s exactly the same way we will use these data in cancer research.” Encode provides a road map with traffic patterns for alternate ways to go after cancer genes, he said.
Dr. Bernstein said, “This is a resource, like the human genome, that will drive science forward.”
The system, though, is stunningly complex, with many redundancies. Just the idea of so many switches was almost incomprehensible, Dr. Bernstein said.
There also is a sort of DNA wiring system that is almost inconceivably intricate.
“It is like opening a wiring closet and seeing a hairball of wires,” said Mark Gerstein, an Encode researcher from Yale. “We tried to unravel this hairball and make it interpretable.”
There is another sort of hairball as well: the complex three-dimensional structure of DNA. Human DNA is such a long strand — about 10 feet of DNA stuffed into a microscopic nucleus of a cell — that it fits only because it is tightly wound and coiled around itself. When they looked at the three-dimensional structure — the hairball — Encode researchers discovered that small segments of dark-matter DNA are often quite close to genes they control. In the past, when they analyzed only the uncoiled length of DNA, those controlling regions appeared to be far from the genes they affect.
The project began in 2003, as researchers began to appreciate how little they knew about human DNA. In recent years, some began to find switches in the 99 percent of human DNA that is not genes, but they could not fully characterize or explain what a vast majority of it was doing.
The thought before the start of the project, said Thomas Gingeras, an Encode researcher from Cold Spring Harbor Laboratory, was that only 5 to 10 percent of the DNA in a human being was actually being used.
The big surprise was not only that almost all of the DNA is used but also that a large proportion of it is gene switches. Before Encode, said Dr. John Stamatoyannopoulos, a University of Washington scientist who was part of the project, “if you had said half of the genome and probably more has instructions for turning genes on and off, I don’t think people would have believed you.”
By the time the National Human Genome Research Institute, part of the National Institutes of Health, embarked on Encode, major advances in DNA sequencing and computational biology had made it conceivable to try to understand the dark matter of human DNA. Even so, the analysis was daunting — the researchers generated 15 trillion bytes of raw data. Analyzing the data required the equivalent of more than 300 years of computer time.
Just organizing the researchers and coordinating the work was a huge undertaking. Dr. Gerstein, one of the project’s leaders, has produced a diagram of the authors with their connections to one another. It looks nearly as complicated as the wiring diagram for the human DNA switches. That part of the work is now done, and the hundreds of authors have written their papers.
“There is literally a flotilla of papers,” Dr. Gerstein said. But, he added, more work has yet to be done — there are still parts of the genome that have not been figured out.
That, though, is for the next stage of Encode.
* * *
Published: September 5, 2012
Rethinking ‘Junk’ DNA
A large group of scientists has found that so-called junk DNA, which makes up most of the human genome, does much more than previously thought.
GENES: Each human cell contains about 10 feet of DNA, coiled into a dense tangle. But only a very small percentage of DNA encodes genes, which control inherited traits like eye color, blood type and so on.
JUNK DNA: Stretches of DNA around and between genes seemed to do nothing, and were called junk DNA. But now researchers think that the junk DNA contains a large number of tiny genetic switches, controlling how genes function within the cell.
REGULATION: The many genetic regulators seem to be arranged in a complex and redundant hierarchy. Scientists are only beginning to map and understand this network, which regulates how cells, organs and tissues behave.
DISEASE: Errors or mutations in genetic switches can disrupt the network and lead to a range of diseases. The new findings will spur further research and may lead to new drugs and treatments.
ON THE face of it, the placebo effect makes no sense. Someone suffering from a low-level infection will recover just as nicely whether they take an active drug or a simple sugar pill. This suggests people are able to heal themselves unaided – so why wait for a sugar pill to prompt recovery?
New evidence from a computer model offers a possible evolutionary explanation, and suggests that the immune system has an on-off switch controlled by the mind.
It all starts with the observation that something similar to the placebo effect occurs in many animals, says Peter Trimmer, a biologist at the University of Bristol, UK. For instance, Siberian hamsters do little to fight an infection if the lights above their lab cage mimic the short days and long nights of winter. But changing the lighting pattern to give the impression of summer causes them to mount a full immune response.
Likewise, those people who think they are taking a drug but are really receiving a placebo can have a response which is twice that of those who receive no pills (Annals of Family Medicine, doi.org/cckm8b). In Siberian hamsters and people, intervention creates a mental cue that kick-starts the immune response.
There is a simple explanation, says Trimmer: the immune system is costly to run – so costly that a strong and sustained response could dangerously drain an animal’s energy reserves. In other words, as long as the infection is not lethal, it pays to wait for a sign that fighting it will not endanger the animal in other ways.
According to this picture, developed by the psychologist Nicholas Humphrey, the Siberian hamster subconsciously acts on a cue that it is summer because food supplies to sustain an immune response are plentiful at that time of year. We subconsciously respond to treatment – even a sham one – because it comes with assurances that it will weaken the infection, allowing our immune response to succeed rapidly without straining the body’s resources.
Trimmer’s simulation is built on this assumption – that animals need to spend vital resources on fighting low-level infections. The model revealed that, in challenging environments, animals lived longer and sired more offspring if they endured infections without mounting an immune response. In more favourable environments, it was best for animals to mount an immune response and return to health as quickly as possible (Evolution and Human Behavior, doi.org/h8p). The results show a clear evolutionary benefit to switching the immune system on and off depending on environmental conditions.
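Trimmer’s published model is more sophisticated, but its central trade-off can be sketched in a toy simulation. Every parameter below (energy budgets, infection rates, costs) is invented for illustration; only the qualitative logic comes from the article:

```python
import random

def lifetime_offspring(fight_infections, env_favourable, n_seasons=300, seed=1):
    """Toy life history: offspring count for one strategy in one environment.

    Fighting an infection clears it immediately but burns a large chunk of
    energy; tolerating it costs little energy per season, but the infection
    lingers and blocks reproduction while it lasts. All numbers are
    illustrative assumptions, not Trimmer's parameters.
    """
    rng = random.Random(seed)
    energy, offspring, infected = 10.0, 0, False
    for _ in range(n_seasons):
        energy += 2.0 if env_favourable else 0.6      # seasonal food intake
        if not infected and rng.random() < 0.3:
            infected = True                            # catch a low-level infection
        if infected:
            if fight_infections:
                energy -= 3.0                          # costly immune response...
                infected = False                       # ...clears it at once
            else:
                energy -= 0.3                          # mild chronic drain
                if rng.random() < 0.1:
                    infected = False                   # slowly clears on its own
        if energy <= 0:
            return offspring                           # death by energy exhaustion
        if not infected and energy > 12.0:
            offspring += 1                             # spare energy -> reproduce
            energy -= 3.0
    return offspring

rich_fight = lifetime_offspring(True, True)
rich_wait  = lifetime_offspring(False, True)
poor_fight = lifetime_offspring(True, False)
poor_wait  = lifetime_offspring(False, False)
```

In the favourable environment the energetic cost of immunity is easily paid, so fighting infections yields more offspring; in the harsh environment the same response drains reserves faster than food replaces them, and tolerating the infection wins, mirroring the evolutionary benefit the model describes.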
“I’m pleased to see that my theory stands up to computational modelling,” says Humphrey. If the idea is right, he adds, it means we have misunderstood the nature of placebos. Farming and other innovations in the past 10,000 years mean that many people have a stable food supply and can safely mount a full immune response at any time – but our subconscious switch has not yet adapted to this. A placebo tricks the mind into thinking it is an ideal time to switch on an immune response, says Humphrey.
Paul Enck at the University of Tübingen in Germany says it is an intriguing idea, but points out that there are many different placebo responses, depending on the disease. It is unlikely that a single mechanism explains them all, he says.
ScienceDaily (Sep. 2, 2012) — For years, doctors treating those with HIV have recognized a relationship between how faithfully patients take the drugs they prescribe, and how likely the virus is to develop drug resistance. More recently, research has shown that the relationship between adherence to a drug regimen and resistance is different for each of the drugs that make up the “cocktail” used to control the disease.
HIV is shown attaching to and infecting a T4 cell. The virus then inserts its own genetic material into the T4 cell’s host DNA. The infected host cell then manufactures copies of the HIV.
New research conducted by Harvard scientists could help explain why those differences exist, and may help doctors quickly and cheaply design new combinations of drugs that are less likely to result in resistance.
As described in a September 2 paper in Nature Medicine, a team of researchers led by Martin Nowak, Professor of Mathematics and of Biology and Director of the Program for Evolutionary Dynamics, has developed a technique medical researchers can use to model the effects of various treatments, and predict whether they will cause the virus to develop resistance.
“What we demonstrate in this paper is a prototype for predicting, through modeling, whether a patient at a given adherence level is likely to develop resistance to treatment,” Alison Hill, a PhD student in Biophysics and co-first author of the paper, said. “Compared to the time and expense of a clinical trial, this method offers a relatively easy way to make these predictions. And, as we show in the paper, our results match with what doctors are seeing in clinical settings.”
The hope, said Nowak, is that the new technique will take some of the guesswork out of what is now largely a trial-and-error process.
“This is a mathematical tool that will help design clinical trials,” he said. “Right now, researchers are using trial and error to develop these combination therapies. Our approach uses the mathematical understanding of evolution to make the process more akin to engineering.”
Creating a model that can make such predictions accurately, however, requires huge amounts of data.
To get that data, Hill and Daniel Scholes Rosenbloom, a PhD student in Organismic and Evolutionary Biology and the paper’s other first author, turned to Johns Hopkins University Medical School, where Professor of Medicine and of Molecular Biology and Genetics Robert F. Siliciano was working with PhD student Alireza Rabi (also co-first author) to study how the HIV virus reacted to varying drug dosages.
Such data proved critical to the model that Hill, Rabi and Rosenbloom eventually designed, because the level of the drug in patients — even those who adhere to their treatment perfectly — naturally varies. When drug levels are low — as they are between doses, or if a dose is missed — the virus is better able to replicate and grow. Higher drug levels, by contrast, may keep the virus in check, but they also increase the risk of mutant strains of the virus emerging, leading to drug resistance.
Armed with the data from Johns Hopkins, Hill, Rabi and Rosenbloom created a computer model that could predict whether and how much the virus, or a drug-resistant strain, was growing based on how strictly patients stuck to their drug regimen.
“Our model is essentially a simulation of what goes on during treatment,” Rosenbloom said. “We created a number of simulated patients, each of whom had different characteristics, and then we said, ‘Let’s imagine these patients have 60 percent adherence to their treatment — they take 60 percent of the pills they’re supposed to.’ Our model can tell us what their drug concentration is over time, and based on that, we can say whether the virus is growing or shrinking, and whether they’re likely to develop resistance.”
The model’s predictions, Rosenbloom explained, can then serve as a guide to researchers as they work to design new drug cocktails to combat HIV.
While their model does hold out hope for simplifying the process of designing drug “cocktails,” Hill and Rosenbloom said they plan to continue to refine the model to take additional factors — such as multiple drug-resistant mutant strains of the virus and varying drug concentrations in other parts of the body — into account.
“The prototype we have so far looks at concentrations of drugs in blood plasma,” Rosenbloom explained. “But a number of drugs don’t penetrate other parts of the body, like the brain or the gut, with the same efficiency, so it’s important to model these other areas where the concentrations of drugs might not be as high.”
Ultimately, though, both say their model can offer new hope to patients by helping doctors design better, cheaper and more efficient treatments.
“Over the past 10 years, the number of HIV-infected people receiving drug treatment has increased immensely,” Hill said. “Figuring out what the best ways are to treat people in terms of cost effectiveness, adherence and the chance of developing resistance is going to become even more important.”
Daniel I S Rosenbloom, Alison L Hill, S Alireza Rabi, Robert F Siliciano, Martin A Nowak. Antiretroviral dynamics determines HIV evolution and predicts therapy outcome. Nature Medicine, 2012; DOI: 10.1038/nm.2892
* * *
Anti-HIV Drug Simulation Offers ‘Realistic’ Tool to Predict Drug Resistance and Viral Mutation
ScienceDaily (Sep. 2, 2012) — Pooling data from thousands of tests of the antiviral activity of more than 20 commonly used anti-HIV drugs, AIDS experts at Johns Hopkins and Harvard universities have developed what they say is the first accurate computer simulation to explain drug effects. Already, the model clarifies how and why some treatment regimens fail in some patients who lack evidence of drug resistance. Researchers say their model is based on specific drugs, precise doses prescribed, and on “real-world variation” in how well patients follow prescribing instructions.
Johns Hopkins co-senior study investigator and infectious disease specialist Robert Siliciano, M.D., Ph.D., says the mathematical model can also be used to predict how well a patient is likely to do on a specific regimen, based on their prescription adherence. In addition, the model factors in each drug’s ability to suppress viral replication and the likelihood that such suppression will spur development of drug-resistant, mutant HIV strains.
“With the help of our simulation, we can now tell with a fair degree of certainty what level of viral suppression is being achieved — how hard it is for the virus to grow and replicate — for a particular drug combination, at a specific dosage and drug concentration in the blood, even when a dose is missed,” says Siliciano, a professor at the Johns Hopkins University School of Medicine and a Howard Hughes Medical Institute investigator. This information, he predicts, will remove “a lot of the current trial and error, or guesswork, involved in testing new drug combination therapies.”
Siliciano says the study findings, to be reported in the journal Nature Medicine online Sept. 2, should help scientists streamline development and clinical trials of future combination therapies, by ruling out combinations unlikely to work.
One application of the model could be further development of drug combinations that can be contained in a single pill taken once a day. That could lower the chance of resistance, even if adherence is not perfect. Such future drug regimens, he says, will ideally strike a balance between optimizing viral suppression and minimizing risk of drug resistance.
Researchers next plan to expand their modeling beyond blood levels of virus to other parts of the body, such as the brain, where antiretroviral drug concentrations can be different from those measured in the blood. They also plan to expand their analysis to include multiple-drug-resistant strains of HIV.
Besides Siliciano, Johns Hopkins joint medical-doctoral student Alireza Rabi was a co-investigator in this study. Other study investigators included doctoral candidates Daniel Rosenbloom, M.S.; Alison Hill, M.S.; and co-senior study investigator Martin Nowak, Ph.D. — all at Harvard University.
Funding support for this study, which took two years to complete, was provided by the National Institutes of Health, with corresponding grant numbers R01-MH54907, R01-AI081600, R01-GM078986; the Bill and Melinda Gates Foundation; the Cancer Research Institute; the National Science Foundation; the Howard Hughes Medical Institute; Natural Sciences and Engineering Research Council of Canada; the John Templeton Foundation; and J. Epstein.
Currently, an estimated 8 million of the more than 34 million people in the world living with HIV are taking antiretroviral therapy to keep their disease in check. An estimated 1,178,000 in the United States are infected, including 23,000 in the state of Maryland.
Daniel I S Rosenbloom, Alison L Hill, S Alireza Rabi, Robert F Siliciano, Martin A Nowak. Antiretroviral dynamics determines HIV evolution and predicts therapy outcome. Nature Medicine, 2012; DOI: 10.1038/nm.2892
ScienceDaily (Aug. 30, 2012) — Reliance on supernatural explanations for major life events, such as death and illness, often increases rather than declines with age, according to a new psychology study from The University of Texas at Austin.
The study, published in the June issue of Child Development, offers new insight into developmental learning.
“As children assimilate cultural concepts into their intuitive belief systems — from God to atoms to evolution — they engage in coexistence thinking,” said Cristine Legare, assistant professor of psychology and lead author of the study. “When they merge supernatural and scientific explanations, they integrate them in a variety of predictable and universal ways.”
Legare and her colleagues reviewed more than 30 studies on how people (ages 5-75) from various countries reason with three major existential questions: the origin of life, illness and death. They also conducted a study with 366 respondents in South Africa, where biomedical and traditional healing practices are both widely available.
As part of the study, Legare presented the respondents with a variety of stories about people who had AIDS. They were then asked to endorse or reject several biological and supernatural explanations for why the characters in the stories contracted the virus.
According to the findings, participants of all age groups agreed with biological explanations for at least one event. Yet supernatural explanations such as witchcraft were also frequently supported among children (ages 5 and up) and universally among adults.
Among the adult participants, only 26 percent attributed the illness to either biology or witchcraft alone. Another 38 percent merged biological and supernatural explanations into a single theory. For example: “Witchcraft, which is mixed with evil spirits, and unprotected sex caused AIDS.” However, 57 percent offered accounts in which witchcraft operates through biological means. For example: “A witch can put an HIV-infected person in your path.”
Legare said the findings contradict the common assumption that supernatural beliefs dissipate with age and knowledge.
“The findings show supernatural explanations for topics of core concern to humans are pervasive across cultures,” Legare said. “If anything, in both industrialized and developing countries, supernatural explanations are frequently endorsed more often among adults than younger children.”
The results provide evidence that reasoning about supernatural phenomena is a fundamental and enduring aspect of human thinking, Legare said.
“The standard assumption that scientific and religious explanations compete should be re-evaluated in light of substantial psychological evidence,” Legare said. “The data, which spans diverse cultural contexts across the lifespan, shows supernatural reasoning is not necessarily replaced with scientific explanations following gains in knowledge, education or technology.”
Cristine H. Legare, E. Margaret Evans, Karl S. Rosengren, Paul L. Harris. The Coexistence of Natural and Supernatural Explanations Across Cultures and Development. Child Development, 2012; 83 (3): 779 DOI: 10.1111/j.1467-8624.2012.01743.x
ScienceDaily (Aug. 31, 2012) — Obesity rates in North America are a growing concern for legislators. Expanded waistlines mean rising health-care costs for maladies such as diabetes, heart disease and some cancers. One University of Alberta researcher says that if people do not take measures to get healthy, they may find that governments will throw their weight into administrative measures designed to help us trim the fat.
Nola Ries of the Faculty of Law’s Health Law and Science Policy Group has recently published several articles exploring potential policy measures that could be used to promote healthier behaviour. From the possibility of zoning restrictions on new fast-food outlet locations, mandatory menu labels, placing levies on items such as chips and pop or offering cash incentives for leading a more healthy and active lifestyle, she says governments at all levels are looking to adopt measures that will help combat both rising health-care costs and declining fitness levels. But she cautions that finding a solution to such a widespread, complex problem will require a multi-layered approach.
“Since eating and physical activity behaviour are complex and influenced by many factors, a single policy measure on its own is not going to be the magic bullet,” said Ries. “Measures at multiple levels — directed at the food and beverage industry, at individuals, at those who educate and those who restrict — must work together to be effective.”
Junk-food tax: A lighter wallet equals a lighter you?
Ries notes that several countries have already adopted tax measures against snack foods and beverages, similar to “sin taxes” placed on alcohol and tobacco. Although Canada has imposed its GST on various sugary and starchy snacks (no tax is charged on basic groceries such as meats, vegetables and fruits), Ries points to other countries such as France and Romania, where the tax rate is much higher. She says taxing products such as sugar-sweetened beverages would likely not only reduce consumption (and curb some weight gain) if the tax is high enough, but also provide a revenue stream to combat the problem on other levels.
“Price increases through taxation do help discourage consumption of ‘sin’ products, especially for younger and lower-income consumers,” said Ries. “Such taxes would provide a source of government revenue that could be directed to other programs to promote healthier lifestyles.”
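The fiscal logic Ries describes — that a levy both curbs consumption and raises revenue — can be sketched with a constant-elasticity demand model. The elasticity figure below is an illustrative assumption, not an empirical estimate for any real market:

```python
def soda_tax_effect(base_qty, price, tax_rate, elasticity=-1.2):
    """Back-of-envelope sketch: constant-elasticity response to a levy.

    `elasticity` is an assumed own-price elasticity of demand for the
    taxed product; empirical estimates vary widely by product and market.
    """
    new_price = price * (1 + tax_rate)
    new_qty = base_qty * (1 + tax_rate) ** elasticity    # demand response
    tax_revenue = (new_price - price) * new_qty          # levy collected per period
    return new_qty, tax_revenue

# A 20% levy on 1,000 drinks per week at $2.00 each, elasticity -1.2:
qty, revenue = soda_tax_effect(base_qty=1000, price=2.00, tax_rate=0.20)
```

Under these assumed numbers consumption falls by roughly a fifth while the levy still collects several hundred dollars a week — the dual effect Ries points to, though a real analysis would need measured elasticities and substitution effects.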
Warning: This menu label may make you eat healthier
Ries notes that prevailing thought says putting nutrition-value information where consumers can see it will enable them to make better food choices. She says many locales in the United States have already implemented mandatory menu labelling. Even though some studies say menu labels do not have a significant impact on consumer behaviour, nutrition details might help some people make more informed eating choices.
“Providing information is less coercive than taxation and outright bans, so governments should provide information along with any other more restrictive measure,” said Ries. “If a more coercive policy is being implemented, it’s important for citizens to understand the rationale for it.”
Coaxing our way to good health?
Ries notes that some programs designed to create more active citizens, such as the child fitness tax credit, do not seem to have the desired effect. Yet, she says that offering incentives for living healthier and exercising more may have a greater impact on getting people active. She points to similar programs used for weight loss and smoking cessation, which had a positive effect on behaviour change, at least in the short term. More work needs to be done to establish an enticement plan with longer-term effects, one that may incorporate points accumulated for healthy types of behaviour that could be redeemed for health- and fitness-related products and services. She says investing money into more direct incentive programs may be more effective than messages that simply give general advice about healthy lifestyles.
“Instead of spending more money on educational initiatives to tell people what they already know — like eat your greens and get some exercise — I suggest it’s better to focus on targeted programs that help people make and sustain behaviour change,” said Ries. “Financial incentive programs are one option; the question there is how best to target such programs and to design them to support long-term healthy behaviour.”
ScienceDaily (Aug. 27, 2012) — Increased levels of depression as a result of discrimination could contribute to low birth weight babies.
Given the well-documented relationship between low birth weight and the increased risk of health problems throughout one’s lifespan, it is vital to reduce any potential contributors to low birth weight. A new study by Valerie Earnshaw and her colleagues from Yale University sheds light on one possible causal factor. Their findings, published online in Springer’s journal, the Annals of Behavioral Medicine, suggest that chronic, everyday instances of discrimination against pregnant, urban women of color may play a significant role in contributing to low birth weight babies.
Twice as many black women give birth to low birth weight babies as white or Latina women in the U.S. Reasons for this disparity are, as yet, unclear. But initial evidence suggests a link may exist between discrimination experienced while pregnant and the incidence of low birth weight. In addition, experiences of discrimination have also been linked to depression, which causes physiological changes that can have a negative effect on a pregnancy.
Earnshaw and her colleagues interviewed 420 black and Latina women, ages 14 to 21, at 14 community health centers and hospitals in New York, during the second and third trimesters of their pregnancies, and at six and 12 months after their babies had been born. They measured the women’s reported experiences of discrimination, as well as their depressive symptoms, pregnancy distress and pregnancy symptoms.
Levels of everyday discrimination reported were generally low. However, the impact of discrimination was the same in all the participants regardless of age, ethnicity or type of discrimination reported. Women reporting greater levels of discrimination were more prone to depressive symptoms, and ultimately went on to have babies with lower birth weights than those reporting lower levels of discrimination. This has implications for healthcare providers who work with pregnant teens and young women during the pre-natal period, while they have the opportunity to try to reduce the potential impacts of discrimination on the pregnancy.
The authors conclude that “Given the associations between birth weight and health across the life span, it is critical to reduce discrimination directed at urban youth of color so that all children are able to begin life with greater promise for health. In doing so, we have the possibility to eliminate disparities not only in birth weight, but in health outcomes across the lifespan.”
Data for this study came from the Centering Pregnancy Plus project, funded by the National Institute of Mental Health, and conducted in collaboration with Clinical Directors’ Network and the Centering Healthcare Institute.
Valerie A. Earnshaw, Lisa Rosenthal, Jessica B. Lewis, Emily C. Stasko, Jonathan N. Tobin, Tené T. Lewis, Allecia E. Reid, Jeannette R. Ickovics. Maternal Experiences with Everyday Discrimination and Infant Birth Weight: A Test of Mediators and Moderators Among Young, Urban Women of Color. Annals of Behavioral Medicine, 2012; DOI: 10.1007/s12160-012-9404-3