Tag archive: Biotechnology

Bioengineering study finds two-cell mouse embryos already ‘talking’ about their future (Science Daily)

Date:

November 26, 2014

Source:

University of California – San Diego

Summary:

Bioengineers have discovered that mouse embryos are contemplating their cellular fates in the earliest stages after fertilization when the embryo has only two to four cells, a discovery that could upend the scientific consensus about when embryonic cells begin differentiating into cell types. Their research used single-cell RNA sequencing to look at every gene in the mouse genome.


The research team used single-cell RNA-sequencing to measure every gene in the mouse genome at multiple stages of development to find differences in gene expression at precise stages. Credit: Art by Victor O. Leshyk provided courtesy of bioengineering professor Sheng Zhong, UC San Diego Jacobs School of Engineering.

Bioengineers at the University of California, San Diego have discovered that mouse embryos are contemplating their cellular fates in the earliest stages after fertilization, when the embryo has only two to four cells, a discovery that could upend the scientific consensus about when embryonic cells begin differentiating into cell types. Their research, which used single-cell RNA sequencing to look at every gene in the mouse genome, was published recently in the journal Genome Research. In addition, the group published a paper in the Proceedings of the National Academy of Sciences on the analysis of “time-course” single-cell data, which is taken at precise stages of embryonic development.

“Until recently, we haven’t had the technology to look at cells this closely,” said Sheng Zhong, a bioengineering professor at UC San Diego Jacobs School of Engineering, who led the research. “Using single-cell RNA-sequencing, we were able to measure every gene in the mouse genome at multiple stages of development to find differences in gene expression at precise stages.”
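The kind of comparison this approach enables can be sketched with a toy example (the gene names and counts below are invented for illustration, not data from the study): flag genes whose expression differs sharply between the two blastomeres of a two-cell embryo.

```python
import math

# Toy single-cell expression counts (hypothetical genes and values,
# for illustration only -- not data from the actual study).
expression = {
    #  gene      cell_A  cell_B   (two blastomeres of a 2-cell embryo)
    "Wnt9a":    (120.0,   15.0),
    "Gapdh":    (500.0,  480.0),
    "Sox2":     ( 40.0,  160.0),
}

def log2_fold_change(a, b, pseudo=1.0):
    """log2 ratio of expression between two cells; the pseudocount
    avoids division by zero for unexpressed genes."""
    return math.log2((a + pseudo) / (b + pseudo))

# Flag genes whose expression differs more than 2-fold between the cells.
inclined = {g: round(log2_fold_change(a, b), 2)
            for g, (a, b) in expression.items()
            if abs(log2_fold_change(a, b)) > 1.0}

print(inclined)  # Wnt9a and Sox2 differ between the blastomeres; Gapdh does not
```

A fold-change cutoff like this is only the crudest version of what single-cell analyses do, but it captures the core idea: even sister cells of a two-cell embryo can already show asymmetric gene expression.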

The findings reveal cellular activity that could provide insight into where normal developmental processes break down, leading to early miscarriages and birth defects.

The researchers discovered that a handful of genes are clearly signaling to each other at the two-cell and four-cell stage, which happens within days after an egg has been fertilized by sperm and before the embryo has implanted into the uterus. Among the identified genes are several genes belonging to the WNT signaling pathway, well-known for their role in cell-cell communications.

The prevailing view until now has been that mammalian embryos start differentiating into cell types after they have proliferated into large enough numbers to form subgroups. According to the co-authors Fernando Biase and Xiaoyi Cao, when the first cell fate decision is made is an open question. The first major task for an embryo is to decide which cells will begin forming the fetus, and which will form the placenta.

The research was funded by the National Institutes of Health (DP2OD007417) and the March of Dimes Foundation.

Zhong’s research in the field of systems, or network, biology applies engineering principles to understand how biological systems function. For example, his group developed analytical methods to predict personal phenotypes, which refer to the physical description of an individual ranging from eye and hair color to health and disposition, using an individual’s personal genome and epigenome. The epigenome refers to the chemical compounds on DNA that regulate gene expression and vary from person to person. Predicting phenotypes from the genome and epigenome is an emerging area of research in the field of personalized medicine that scientists believe could provide new ways to predict and treat genetic disorders.

Story Source:

The above story is based on materials provided by University of California – San Diego. Note: Materials may be edited for content and length.

Journal References:

  1. F. H. Biase, X. Cao, S. Zhong. Cell fate inclination within 2-cell and 4-cell mouse embryos revealed by single-cell RNA sequencing. Genome Research, 2014; 24 (11): 1787 DOI: 10.1101/gr.177725.114
  2. W. Huang, X. Cao, F. H. Biase, P. Yu, S. Zhong. Time-variant clustering model for understanding cell fate decisions. Proceedings of the National Academy of Sciences, 2014; 111 (44): E4797 DOI: 10.1073/pnas.1407388111

Scientists call for limits on the creation of deadly viruses in the laboratory (O Globo)

JC e-mail 4991, July 17, 2014

Failures at American facilities raise the risk of outbreaks

A multidisciplinary group of scientists from leading universities in several countries yesterday published a warning about the handling, in US laboratories, of viruses that could spread and infect humans and other mammals. The concern follows a string of news reports about failures involving potentially dangerous microorganisms.

“Recent incidents with smallpox, anthrax and avian flu at some of the most important US laboratories remind us of the fallibility of even the most secure facilities, reinforcing the urgent need for a thorough reassessment of biosafety,” wrote the self-styled “Cambridge Working Group,” made up of researchers from the universities of Harvard, Yale and Ottawa, among others.

In the warning, they report that incidents with pathogens have been increasing, occurring on average twice a week in the country’s private and public laboratories. The figure comes from a 2012 study in the journal “Applied Biosafety.”

– When a case appears in the press, it gives the impression that such episodes are rare, but they are not – commented Amir Attaran, of the University of Ottawa, one of the scientists who signed the document. – We are worried about the dangerous experiments being carried out to engineer the most infectious and deadly strains of influenza and of severe acute respiratory syndrome (SARS). We believe this reckless, senseless science could injure or kill a great number of people. Last week, the Centers for Disease Control and Prevention admitted that high-security laboratories had lost track of some samples.

In the latest case, vials of smallpox were found by chance in a disused storage room of a federal laboratory in Washington. They are estimated to have been there for more than 50 years.

Attaran compared the group’s statement to the one scientists made in 1943, before the bombing of Hiroshima in the Second World War. He said the risks of these experiments outweigh their possible benefits, citing the in vitro re-creation of the 1918 Spanish flu virus, which killed 40 million people. In 2006, scientists carried out that experiment in an American laboratory.

– This is not a reason for alarm here – says Volnei Garrafa, coordinator of the Graduate Program in Bioethics at UnB and a member of Unesco’s International Bioethics Committee. – But their concern is valid, there are always risks, and the government would need to take a position.

READ THE DOCUMENT IN FULL:
Recent incidents involving smallpox, anthrax and avian flu at some of the most important laboratories in the United States remind us of the fallibility of even the most secure facilities, reinforcing the urgent need for a thorough reassessment of biosafety. Such incidents have been increasing, occurring on average twice a week with regulated pathogens in the country’s private and public laboratories. An accidental infection with any pathogen is worrying. But the risk of an accident with the newly created “potential pandemic pathogens” raises grave new concerns.

The laboratory creation of new strains of dangerous, highly transmissible viruses, especially but not only of influenza, poses substantially greater risks. An accidental infection in such a setting could trigger outbreaks that would be difficult or impossible to control. Historically, new influenza strains, once they begin transmission in the human population, have infected a quarter or more of the world’s population within two years.

For any experiment, the expected benefits should outweigh the risks. Experiments involving the creation of potential pandemic pathogens should be curtailed until there is a quantitative, objective and reliable assessment of the possible benefits and of the opportunities for risk mitigation, as well as a comparison against safer experimental approaches.

A modern version of the Asilomar process, which set the rules for recombinant DNA research, could be a starting point for identifying the best measures to achieve the global public health goals of fighting pandemic disease while ensuring the highest levels of safety. Whenever possible, safety should take priority over actions that carry a risk of accidental pandemic.

(Flávia Milhorance / O Globo)
http://oglobo.globo.com/sociedade/saude/cientistas-pedem-limite-criacao-de-virus-mortais-em-laboratorio-13281731#ixzz37jNAiHZI

Transgenic mosquito for dengue control approved by CTNBio (Portal do Meio Ambiente)

APRIL 17, 2014

Brasília – CTNBio approved the request for commercial release of a transgenic variety of Aedes aegypti (the mosquito that transmits the dengue virus and a new virus, Chikungunya), developed by the British company Oxitec. The A. aegypti OX513a line carries a conditional lethality gene, which is activated in the absence of tetracycline. The males, separated from the females while still at the pupal stage, can be produced in a biofactory in enormous quantities and then released into the environment. For details, see http://br.oxitec.com .

The roll-call vote in the plenary session resulted in 16 votes in favor (one of them conditional) and one against.

Before the vote, the review opinion on the application was read. The reviewing member argued for further diligence on the grounds of several flaws that, in his view, prevented a safe conclusion. His main argument was that eliminating A. aegypti quickly and extensively would open space for recolonization by another mosquito, such as Aedes albopictus. His opinion was roundly rejected by the Commission.

Also before the vote, some members suggested holding a public instruction hearing, which was rejected by 11 votes to 4.

The discussion immediately before the vote dwelt less on the mosquito’s direct risks to human and animal health and to the environment, and drifted toward the benefits of the technology. This shift reflected CTNBio’s consensus on the safety of the product and on the urgency of new techniques for controlling the dengue vector. The discussion also reflected CTNBio’s confidence in the technology’s potential to reduce A. aegypti populations without risks of a resurgence of other diseases, the emergence of new endemic diseases or the replacement of the vector mosquito, in complete opposition to the isolated view of the member who filed the review. A detailed discussion of the reviewer’s position is available at http://goo.gl/7aJZuI.

With these results, CTNBio opens to the country the possibility of using a transgenic mosquito for dengue control. The commercial release of this mosquito is also the first commercial release of a transgenic insect in the world. Brazil, applying efficient and serious legislation to the risk assessment of genetically modified organisms, sets an example of seriousness and maturity both to countries that already carry out GMO risk assessment and to those still hesitating to adopt this technology.

Source: GenPeace.

*   *   *

April 17, 2014 – 12:13 pm

Transgenic mosquitoes approved, but researchers fear risks (Adital)

by Mateus Ramos, Adital


An important, and dangerous, step was taken last week by the National Technical Biosafety Commission (CTNBio), which approved the release of genetically modified mosquitoes in Brazil. The transgenic mosquitoes will be used for research and for fighting dengue in the country. The project, which allows the mosquitoes to be commercialized by the British company Oxitec, was deemed technically safe by CTNBio and now only needs registration with the National Health Surveillance Agency (Anvisa) to be, in fact, released.

For José Maria Ferraz, professor at the Federal University of São Carlos (SP) and former CTNBio member, speaking to Adital, the Commission’s approval is a strong indication that Anvisa will do the same. “It will certainly be approved; the Ministry of Health’s own representative was there and said that, given the dengue epidemics, he was in favor of approving the project.”

Ferraz sharply criticizes both the project and CTNBio’s approval of it. “There is no single policy for fighting dengue, but rather a set of actions. Moreover, there is no guarantee that the released mosquitoes will not also carry the disease; in other words, millions of mosquitoes will be released across the country without a serious prior study of the project. What was done is utterly absurd. It is insanity; I have never seen so many things wrong in a single project.”

Another major problem Ferraz points to is the risk of drastically altering the numbers of the mosquito Aedes aegypti. A possible reduction could increase the proliferation of another, even more harmful mosquito, Aedes albopictus, which transmits not only dengue but other diseases, malaria for example. He also warns that flaws in the project could lead to the release of non-sterile males and of females, making the species harder to control. “The country is serving as a guinea pig for an experiment never before done anywhere in the world. We approved this project very quickly, and irresponsibly.”

The results promised by the project could be undermined, for example, if the mosquito comes into contact with the antibiotic tetracycline, which is found in many cat and dog feeds. “It is enough for the mosquitoes to come into contact with the feces of animals fed rations containing this antibiotic for the whole experiment to fail,” says Ferraz.

Understanding the project

According to Oxitec, the project’s technique consists of introducing two new genes into male mosquitoes which, upon mating with females in the wild, produce larvae incapable of reaching adulthood; that is, the larvae never reach the stage at which they can transmit the disease to humans. The offspring also inherit a marker that makes them visible under a specific light, making them easier to monitor.
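The logic of this kind of suppression can be sketched as a toy discrete-generation model (all parameters below are invented for illustration; this is not Oxitec's actual projection model): wild females mate at random, and any offspring fathered by a released transgenic male die before adulthood.

```python
# Toy discrete-generation model of lethal-male suppression
# (all parameters hypothetical; a sketch of the idea only).

def next_generation(wild_pop, released_males, growth=6.0, capacity=10000.0):
    """Females mate at random; offspring fathered by transgenic males die.
    A hard cap stands in for the environment's carrying capacity."""
    wild_males = wild_pop / 2.0
    viable_fraction = wild_males / (wild_males + released_males)
    offspring = wild_pop * growth * viable_fraction
    return min(offspring, capacity)

pop = 10000.0
for gen in range(8):
    pop = next_generation(pop, released_males=50000.0)
print(pop)  # collapses toward zero when releases swamp the wild males
```

The key feature the sketch captures is that the released males dilute the wild males' share of matings each generation, so a sustained, large release ratio drives the population down multiplicatively.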

* Originally published on the Adital website.

Important and complex systems, from the global financial market to groups of friends, may be highly controllable (Science Daily)

Date: March 20, 2014

Source: McGill University

Summary: Scientists have discovered that all complex systems, whether they are found in the body, in international finance, or in social situations, actually fall into just three basic categories, in terms of how they can be controlled.

All complex systems, whether they are found in the body, in international finance, or in social situations, actually fall into just three basic categories, in terms of how they can be controlled, researchers say. Credit: © Artur Marciniec / Fotolia

We don’t often think of them in these terms, but our brains, global financial markets and groups of friends are all examples of different kinds of complex networks or systems. And unlike the systems in your car, which have been intentionally engineered for humans to use, these systems are convoluted, and it is not obvious how to control them. Economic collapse, disease, and miserable dinner parties may result from a breakdown in such systems, which is why researchers have recently been putting so much energy into trying to discover how best to control these large and important systems.

But now two brothers, Profs. Justin and Derek Ruths, from Singapore University of Technology and Design and McGill University respectively, have suggested, in an article published in Science, that all complex systems, whether they are found in the body, in international finance, or in social situations, actually fall into just three basic categories, in terms of how they can be controlled.

They reached this conclusion by surveying the inputs and outputs and the critical control points in a wide range of systems that appear to function in completely different ways. (The critical control points are the parts of a system that you have to control in order to make it do whatever you want — not dissimilar to the strings you use to control a puppet).
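One standard recipe for finding such control points, from the structural-controllability literature this line of work builds on, computes the minimum number of "driver" nodes of a directed network from a maximum bipartite matching: every node left unmatched needs a direct input. A minimal pure-Python sketch on a made-up four-node graph:

```python
# Minimum driver nodes of a directed network via maximum bipartite matching
# (the structural-controllability recipe; the example graph is invented).

def max_matching(nodes, edges):
    """Augmenting-path maximum matching on the bipartite graph whose left
    and right sides are each a copy of the node set, with one bipartite
    edge per directed link."""
    match_right = {}          # right-node -> matched left-node

    def try_augment(u, seen):
        for (a, b) in edges:
            if a == u and b not in seen:
                seen.add(b)
                if b not in match_right or try_augment(match_right[b], seen):
                    match_right[b] = u
                    return True
        return False

    return sum(try_augment(u, set()) for u in nodes)

# A small chain with a branch: 1 -> 2 -> 3, plus 2 -> 4
nodes = [1, 2, 3, 4]
edges = [(1, 2), (2, 3), (2, 4)]
matched = max_matching(nodes, edges)
drivers = max(len(nodes) - matched, 1)   # unmatched nodes need direct input
print(drivers)  # 2: the source node and one of the two branch targets
```

In the puppet analogy, the driver nodes are where the strings must attach; here node 2 can pass control on to only one of its two downstream neighbors, so the other needs its own string.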

“When controlling a cell in the body, for example, these control points might correspond to proteins that we can regulate using specific drugs,” said Justin Ruths. “But in the case of a national or international economic system, the critical control points could be certain companies whose financial activity needs to be directly regulated.”

One grouping, for example, put organizational hierarchies, gene regulation, and human purchasing behaviour together, in part because in each, it is hard to control individual parts of the system in isolation. Another grouping includes social networks such as groups of friends (whether virtual or real), and neural networks (in the brain), where the systems allow for relatively independent behaviour. The final group includes things like food systems, electrical circuits and the internet, all of which function basically as closed systems where resources circulate internally.

Referring to these groupings, Derek Ruths commented, “While our framework does provide insights into the nature of control in these systems, we’re also intrigued by what these groupings tell us about how very different parts of the world share deep and fundamental attributes in common — which may help unify our understanding of complexity and of control.”

“What we really want people to take away from the research at this point is that we can control these complex and important systems in the same way that we can control a car,” says Justin Ruths. “And that our work is giving us insight into which parts of the system we need to control and why. Ultimately, at this point we have developed some new theory that helps to advance the field in important ways, but it may still be another five to ten years before we see how this will play out in concrete terms.”

Journal Reference:

  1. Justin Ruths and Derek Ruths. Control Profiles of Complex Networks. Science, 2014 DOI: 10.1126/science.1242063

Frontiers of biotechnology (O Estado de S.Paulo)

JC e-mail 4872, December 10, 2013

Article by Xico Graziano published in Estadão

Transgenic plants are here to stay. And to prevail. Their varieties have come to dominate Brazil’s grain harvest. In the technology race, nothing holds back genetic engineering. Science overcomes obscurantist fear.

Genetically modified soybean, corn and cotton crops, in that order, lead the nation’s planted area, accounting for two thirds of it. Productivity, ease of management, savings on pesticides: these are the main reasons for their remarkable performance. Agronomic problems, such as weed resistance to herbicides or pest resurgence, do exist, but they resemble those of conventional crops. No environmental tragedy, nor harm to human health, has been shown to result specifically from the use of transgenics.

For centuries, traditional breeding has been modifying organisms. The varieties planted or raised today bear little resemblance to their ancestors: the chicken is no longer a free-range country bird, corn has become erect, fruits are losing their seeds. No food remains “natural.” The level of evolution changed, however, when scientists discovered the possibility of artificially modifying species’ DNA. Without sexual crossing.

It all began in 1972. Researchers noticed that parasites of the genus Agrobacterium transferred parts of their germplasm to host plants, stimulating in them the production of the sugar on which the parasites fed. In other words, a mechanism of transgenesis was occurring in nature. Ten years later, in Ghent (Belgium), scientists pioneered transgenesis in the laboratory. Soon afterward, certain bacteria were genetically modified to produce human insulin. Diabetics celebrated. Science had taken a tremendous leap in knowledge.

Since then, leading teams, in public and private laboratories, have invested in genetic engineering, supercharging biotechnology worldwide. It first stood out in the manipulation of microorganisms. Then, in 1996, it reached the field, with the launch of a soybean variety resistant to herbicide application. The great controversy began. Environmental activists denounced “Frankenstein food.” Religious figures condemned scientists for manipulating life. Public opinion was left confused.

That understandable fear led to the proposal of a five-year “moratorium,” a precaution adopted by the European Union in 1999. The period was considered sufficient to clear up the doubts about the new technology. Time passed, genetic engineering evolved, and religious and ideological prejudices gave way to scientific evidence. New transgenics appeared; barriers kept falling. Today, in agriculture, modern genetically altered varieties are present in 50 countries, planted by 17.3 million farmers and occupying 10% of the world’s arable land. It is no longer an experiment.

Biotech novelties keep appearing. Among animals, transgenic goats are being developed that produce in their milk a protein typical of spider silk, capable of generating highly resistant polymers. Among plants, the prospect of generating varieties that withstand water stress is exciting. At Embrapa, a gene from drought-resistant coffee trees was introduced into tobacco plants, enabling them to withstand a lack of water in the soil. In Israel, scientists at the Institute of Technology altered lettuce genes to keep the leaves from wilting after harvest. Sensational.

Techniques known as “recombinant DNA” are sweeping into medicine. Using them, the Instituto Butantã (São Paulo) recently developed a vaccine against hepatitis B; intervention in viral genomes is also yielding vaccines against influenza, dengue, whooping cough and tuberculosis. At the USP Medical School in Ribeirão Preto, a transgenic vaccine to fight cancer is under study. Genetically modified pigs in Munich (Germany) provoked only a weak reaction from the human immune system, opening the way to xenotransplantation.

Genetically modified bacteria, yeasts and fungi have long been used in food production. These microorganisms act directly in fermentation processes, yielding cheeses, pastas and beer; they even help define the aroma of foods and drinks. Cellulosic ethanol, from sugarcane bagasse or grass, will come from genetically modified yeasts. In industry, washing powder contains enzymes, derived from transgenic bacteria, that help break down grease in fabrics.

At the frontier of biotechnology, an incredible technique is being developed here at Embrapa – constitutive promoters – capable of restricting the expression of certain transgenic proteins in the leaves and fruits of modified plants. That is, the plant will be transgenic, but its fruits, or grains, escape the altered DNA. The progress of genetic engineering, the basis of biotechnology, is extraordinary in every branch, giving the impression that the best is yet to come.

Why, then, in the face of so much success, are there still objections to transgenics, branding them products of evil? Good question. The answer lies in the prejudice created back then. Strictly speaking, transgenic products today, subject to ultra-strict legislation, are quite safe to consume. Other foods, though “conventional,” look more like a chemical bomb: snack foods, canned goods, mayonnaises, insipid sweets; those concoctions, yes, destroy our health with impunity.

Conclusion: transgenic or conventional, it matters little. What matters is that the food be healthy.

Xico Graziano is an agronomist and served as Secretary of Agriculture and Secretary of the Environment of the State of São Paulo

http://www.estadao.com.br/noticias/impresso,fronteiras–da-biotecnologia-,1106577,0.htm

Life span of humans took a huge jump in past century (MSNBC)

Researchers credit environmental improvements, not genetics, for the increase

By Trevor Stokes

updated 10/15/2012 7:11:26 PM ET

Humans are living longer than ever, a life-span extension that occurred more rapidly than expected and almost solely from environmental improvements as opposed to genetics, researchers said Monday.

Four generations ago, the average Swede had the same probability of dying as a hunter-gatherer, but improvements in our living conditions through medicine, better sanitation and clean drinking water (considered “environmental” changes) decreased mortality rates to modern levels in just 100 years, researchers found.

In Japan, 72 has become the new 30, as the likelihood of a 72-year-old modern-day person dying is the same as a 30-year-old hunter-gatherer ancestor who lived 1.3 million years ago. Though the researchers didn’t specifically look at the United States, they say the trends are not country-specific and not based in genetics.

Quick jump in life span
The same decrease in the average probability of dying at a given age that took hunter-gatherers 1.3 million years to achieve was matched in just 30 years during the 21st century.

“I pictured a more gradual transition from a hunter-gatherer mortality profile to something like we have today, rather than this big jump, most of which occurred in the last four generations; to me that was a surprise,” lead author Oskar Burger, postdoctoral fellow at the Max Planck Institute for Demographic Research in Germany, told LiveScience.

Biologists have lengthened life spans of worms, fruit flies and mice in labs by selectively breeding for old-age survivorship or tweaking their endocrine system, a network of glands that affects every cell in the body. However, the longevity gained in humans over the past four generations is even greater than can be created in labs, researchers concluded.

Genetics vs. environment
In the new work, Burger and colleagues analyzed previously published mortality data from Sweden, France and Japan, from present-day hunter-gatherers and from wild chimpanzees, the closest living relatives of humans.

Humans have lived for an estimated 8,000 generations, but only in the past four have mortalities decreased to modern-day levels. Hunter-gatherers today have average life spans on par with wild chimpanzees.

The research suggests that while genetics plays a small role in shaping human mortality, the key in driving up our collective age lies with the advent of medical technologies, improved nutrition, higher education, better housing and several other improvements to the overall standards of living.

“This recent progress has been just astronomically fast compared to what we made since the split from chimpanzees,” Burger said.

The brunt of the decreased mortality comes in youth: by age 15, hunter-gatherers face more than 100 times the chance of dying that modern-day people do.

What’s next?
“In terms of what’s going on in the next four generations, I want to be very clear that I don’t make any forecasts,” Burger said. “We’re in a period of transition and we don’t know what the new stable point will be.”

However, some researchers say that humans may have maxed out their old age.

“These mortality curves (that show the probability of dying by a certain age), they are now currently at their lowest possible value, which makes a very strong prediction that life span cannot increase much more,” Caleb Finch, a neurogerontology professor at the University of Southern California who studies the biological mechanisms of aging, told LiveScience in an email.

Further, Finch, who was not involved in the current study, argues that environmental degradation, including climate change and ozone pollution, combined with increased obesity “are working to throw us back to an earlier phase of our improvements, they’re regressive.”

“It’s impossible to make any reasonable predictions, but you can look, for example, in local environments in Los Angeles where the density of particles in the air predict the rate of heart disease and cancer,” Finch said, illustrating the link between the environment and health.

The study was detailed Monday in the journal Proceedings of the National Academy of Sciences.

Researchers Produce First Complete Computer Model of an Organism (Science Daily)

ScienceDaily (July 21, 2012) — In a breakthrough effort for computational biology, the world’s first complete computer model of an organism has been completed, Stanford researchers reported last week in the journal Cell.

The Covert Lab incorporated more than 1,900 experimentally observed parameters into their model of the tiny parasite Mycoplasma genitalium. (Credit: Illustration by Erik Jacobsen / Covert Lab)

A team led by Markus Covert, assistant professor of bioengineering, used data from more than 900 scientific papers to account for every molecular interaction that takes place in the life cycle of Mycoplasma genitalium, the world’s smallest free-living bacterium.

By encompassing the entirety of an organism in silico, the paper fulfills a longstanding goal for the field. Not only does the model allow researchers to address questions that aren’t practical to examine otherwise, it represents a stepping-stone toward the use of computer-aided design in bioengineering and medicine.

“This achievement demonstrates a transforming approach to answering questions about fundamental biological processes,” said James M. Anderson, director of the National Institutes of Health Division of Program Coordination, Planning and Strategic Initiatives. “Comprehensive computer models of entire cells have the potential to advance our understanding of cellular function and, ultimately, to inform new approaches for the diagnosis and treatment of disease.”

The research was partially funded by an NIH Director’s Pioneer Award from the National Institutes of Health Common Fund.

From information to understanding

Biology over the past two decades has been marked by the rise of high-throughput studies producing enormous troves of cellular information. A lack of experimental data is no longer the primary limiting factor for researchers. Instead, it’s how to make sense of what they already know.

Most biological experiments, however, still take a reductionist approach to this vast array of data: knocking out a single gene and seeing what happens.

“Many of the issues we’re interested in aren’t single-gene problems,” said Covert. “They’re the complex result of hundreds or thousands of genes interacting.”

This situation has resulted in a yawning gap between information and understanding that can only be addressed by “bringing all of that data into one place and seeing how it fits together,” according to Stanford bioengineering graduate student and co-first author Jayodita Sanghvi.

Integrative computational models clarify data sets whose sheer size would otherwise place them outside human ken.

“You don’t really understand how something works until you can reproduce it yourself,” Sanghvi said.

Small is beautiful

Mycoplasma genitalium is a humble parasitic bacterium known mainly for showing up uninvited in human urogenital and respiratory tracts. But the pathogen also has the distinction of containing the smallest genome of any free-living organism — only 525 genes, as opposed to the 4,288 of E. coli, a more traditional laboratory bacterium.

Despite the difficulty of working with this sexually transmitted parasite, the minimalism of its genome has made it the focus of several recent bioengineering efforts. Notably, these include the J. Craig Venter Institute’s 2008 synthesis of the first artificial chromosome.

“The goal hasn’t only been to understand M. genitalium better,” said co-first author and Stanford biophysics graduate student Jonathan Karr. “It’s to understand biology generally.”

Even at this small scale, the quantity of data that the Stanford researchers incorporated into the virtual cell’s code was enormous. The final model made use of more than 1,900 experimentally determined parameters.

To integrate these disparate data points into a unified machine, the researchers modeled individual biological processes as 28 separate "modules," each governed by its own algorithm. These modules then communicated with each other after every time step, making for a unified whole that closely matched M. genitalium's real-world behavior.
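The module-based architecture described above can be sketched in a few lines of Python. This is a hypothetical simplification for illustration only: the class names, rates, and shared-state layout are invented, and only two of the model's 28 processes are represented.

```python
# Toy sketch of a modular whole-cell simulation loop: each "module"
# models one biological process and updates a shared cell state once
# per time step, then control passes to the next module.

class Metabolism:
    def step(self, state, dt):
        # Produce free nucleotides at a fixed rate (illustrative).
        state["nucleotides"] += 5.0 * dt

class DnaReplication:
    def step(self, state, dt):
        # Consume available nucleotides to extend the new DNA strand,
        # capped by a maximum polymerization rate.
        used = min(state["nucleotides"], 10.0 * dt)
        state["nucleotides"] -= used
        state["dna_copied"] += used

def simulate(modules, state, dt=1.0, steps=100):
    for _ in range(steps):
        for m in modules:  # every module sees the updated state each step
            m.step(state, dt)
    return state

state = simulate([Metabolism(), DnaReplication()],
                 {"nucleotides": 0.0, "dna_copied": 0.0})
print(state)  # replication proceeds at the rate metabolism supplies
```

The design choice mirrors the one described in the article: no single equation governs the whole cell; instead, independent process algorithms are coupled only through the shared state they read and write at each step.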

Probing the silicon cell

The purely computational cell opens up procedures that would be difficult to perform in an actual organism, as well as opportunities to reexamine experimental data.

In the paper, the model is used to demonstrate a number of these approaches, including detailed investigations of DNA-binding protein dynamics and the identification of new gene functions.

The program also allowed the researchers to address aspects of cell behavior that emerge from vast numbers of interacting factors.

The researchers had noticed, for instance, that the length of individual stages in the cell cycle varied from cell to cell, while the length of the overall cycle was much more consistent. Consulting the model, the researchers hypothesized that the overall cell cycle’s lack of variation was the result of a built-in negative feedback mechanism.

Cells that took longer to begin DNA replication had time to amass a large pool of free nucleotides. The actual replication step, which uses these nucleotides to form new DNA strands, then passed relatively quickly. Cells that went through the initial step more quickly, on the other hand, had no nucleotide surplus. Replication ended up slowing to the rate of nucleotide production.
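The compensation described above can be illustrated with a toy numerical model. All rates, units, and distributions here are invented for the sketch; it shows only the qualitative effect, not the actual model's parameters.

```python
import random

# Toy illustration of the proposed negative feedback: cells that start
# replication later have stockpiled more free nucleotides, so the
# replication step itself runs faster, compressing total-cycle variation.

random.seed(0)
PRODUCTION = 1.0   # nucleotides produced per unit time (invented)
GENOME = 100.0     # nucleotides needed to copy the genome (invented)
FAST_RATE = 4.0    # replication speed while the stockpile lasts (invented)

init_times, total_times = [], []
for _ in range(1000):
    t_init = random.uniform(10, 50)          # variable pre-replication stage
    pool = PRODUCTION * t_init               # nucleotides amassed so far
    burst = min(pool, GENOME)                # copied quickly from the pool
    t_burst = burst / FAST_RATE
    t_rest = (GENOME - burst) / PRODUCTION   # remainder waits on production
    init_times.append(t_init)
    total_times.append(t_init + t_burst + t_rest)

def spread(xs):
    return max(xs) - min(xs)

# The overall cycle varies far less than the initiation stage alone.
print(spread(init_times), spread(total_times))
```

Late starters pay their delay back with a faster replication step, so variation in the initiation stage is largely cancelled in the total, which is the qualitative behavior the researchers observed.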

These kinds of findings remain hypotheses until they’re confirmed by real-world experiments, but they promise to accelerate the process of scientific inquiry.

“If you use a model to guide your experiments, you’re going to discover things faster. We’ve shown that time and time again,” said Covert.

Bio-CAD

Much of the model’s future promise lies in more applied fields.

CAD — computer-aided design — has revolutionized fields from aeronautics to civil engineering by drastically reducing the trial-and-error involved in design. But our incomplete understanding of even the simplest biological systems has meant that CAD hasn’t yet found a place in bioengineering.

Computational models like that of M. genitalium could bring rational design to biology — allowing not only for computer-guided experimental regimes, but also for the wholesale creation of new microorganisms.

Once similar models have been devised for more experimentally tractable organisms, Karr envisions bacteria or yeast specifically designed to mass-produce pharmaceuticals.

Bio-CAD could also lead to enticing medical advances — especially in the field of personalized medicine. But these applications are a long way off, the researchers said.

“This is potentially the new Human Genome Project,” Karr said. “It’s going to take a really large community effort to get close to a human model.”

Stanford’s Department of Bioengineering is jointly operated by the School of Engineering and the School of Medicine.

Science, Journalism, and the Hype Cycle: My piece in tomorrow's Wall Street Journal (Discover Magazine)

I think one of the biggest struggles a science writer faces is how to accurately describe the promise of new research. If we start promising that a preliminary experiment is going to lead to a cure for cancer, we are treating our readers cruelly–especially the readers who have cancer. On the other hand, scoffing at everything is not a sensible alternative, because sometimes preliminary experiments really do lead to great advances. In the 1950s, scientists discovered that bacteria can slice up virus DNA to avoid getting sick. That discovery led, some 30 years later, to biotechnology–to an industry that enabled, among other things, bacteria to produce human insulin.

This challenge was very much on my mind as I recently read two books, which I review in tomorrow's Wall Street Journal. One is on gene therapy–a treatment that inspired wild expectations in the 1990s, then crashed, and now is coming back. The other is on epigenetics, which seems to me to be in the early stages of the hype cycle. You can read the essay in full here. [see post below]

March 9th, 2012 5:33 PM by Carl Zimmer

Hope, Hype and Genetic Breakthroughs (Wall Street Journal)

By CARL ZIMMER

I talk to scientists for a living, and one of my most memorable conversations took place a couple of years ago with an engineer who put electrodes in bird brains. The electrodes were implanted into the song-generating region of the brain, and he could control them with a wireless remote. When he pressed a button, a bird singing in a cage across the lab would fall silent. Press again, and it would resume its song.

I could instantly see a future in which this technology brought happiness to millions of people. Imagine a girl blind from birth. You could implant a future version of these wireless electrodes in the back of her brain and then feed it images from a video camera.

As a journalist, I tried to get the engineer to explore what seemed to me to be the inevitable benefits of his research. To his great credit, he wouldn’t. He wasn’t even sure his design would ever see the inside of a human skull. There were just too many ways for it to go wrong. He wanted to be very sure that I understood that and that I wouldn’t claim otherwise. “False hope,” he warned me, “is a sinful thing.”

Stephen Voss. Gene therapy allowed this once-blind dog to see again.

Over the past two centuries, medical research has yielded some awesome treatments: smallpox wiped out with vaccines, deadly bacteria thwarted by antibiotics, face transplants. But when we look back across history, we forget the many years of failure and struggle behind each of these advances.

This foreshortened view distorts our expectations for research taking place today. We want to believe that every successful experiment means that another grand victory is weeks away. Big stories appear in the press about the next big thing. And then, as the years pass, the next big thing often fails to materialize. We are left with false hope, and the next big thing gets a reputation as the next big lie.

In 1995, a business analyst named Jackie Fenn captured this intellectual whiplash in a simple graph. Again and again, she had seen new advances burst on the scene and generate ridiculous excitement. Eventually they would reach what she dubbed the Peak of Inflated Expectations. Unable to satisfy their promise fast enough, many of them plunged into the Trough of Disillusionment. Their fall didn’t necessarily mean that these technologies were failures. The successful ones slowly emerged again and climbed the Slope of Enlightenment.

When Ms. Fenn drew the Hype Cycle, she had in mind dot-com-bubble technologies like cellphones and broadband. Yet it’s a good model for medical advances too. I could point to many examples of the medical hype cycle, but it’s hard to think of a better one than the subject of Ricki Lewis’s well-researched new book, “The Forever Fix”: gene therapy.

The concept of gene therapy is beguilingly simple. Many devastating disorders are the result of mutant genes. The disease phenylketonuria, for example, is caused by a mutation to a gene involved in breaking down a molecule called phenylalanine. The phenylalanine builds up in the bloodstream, causing brain damage. One solution is to eat a low-phenylalanine diet for your entire life. A much more appealing alternative would be to somehow fix the broken gene, restoring a person’s metabolism to normal.

In “The Forever Fix,” Ms. Lewis chronicles gene therapy’s climb toward the Peak of Inflated Expectations over the course of the 1990s. A geneticist and the author of a widely used textbook, she demonstrates a mastery of the history, even if her narrative sometimes meanders and becomes burdened by clichés. She explains how scientists learned how to identify the particular genes behind genetic disorders. They figured out how to load genes into viruses and then to use those viruses to insert the genes into human cells.

Stephen Voss. Alisha Bacoccini is tested on her ability to read letters, at UPenn Hospital, in Philadelphia, PA on Monday, June 23, 2008. Bacoccini is undergoing an experimental gene therapy trial to improve her sight.

By 1999, scientists had enjoyed some promising successes treating people—removing white blood cells from leukemia patients, for example, inserting working genes, and then returning the cells to their bodies. Gene therapy seemed as if it was on the verge of becoming standard medical practice. “Within the next decade, there will be an exponential increase in the use of gene therapy,” Helen M. Blau, the then-director of the gene-therapy technology program at Stanford University, told Business Week.

Within a few weeks of Ms. Blau’s promise, however, gene therapy started falling straight into the Trough. An 18-year-old man named Jesse Gelsinger who suffered from a metabolic disorder had enrolled in a gene-therapy trial. University of Pennsylvania scientists loaded a virus with a working version of an enzyme he needed and injected it into his body. The virus triggered an overwhelming reaction from his immune system and within four days Gelsinger was dead.

Gene therapy nearly came to a halt after his death. An investigation revealed errors and oversights in the design of Gelsinger’s trial. The breathless articles disappeared. Fortunately, research did not stop altogether. Scientists developed new ways of delivering genes without triggering fatal side effects. And they directed their efforts at one part of the body in particular: the eye. The eye is so delicate that inflammation could destroy it. As a result, it has evolved physical barriers that keep the body’s regular immune cells out, as well as a separate battalion of immune cells that are more cautious in their handling of infection.

It occurred to a number of gene-therapy researchers that they could try to treat genetic vision disorders with a very low risk of triggering horrendous side effects of the sort that had claimed Gelsinger’s life. If they injected genes into the eye, they would be unlikely to produce a devastating immune reaction, and any harmful effects would not be able to spread to the rest of the body.

Their hunch paid off. In 2009 scientists reported their first success with gene therapy for a congenital disorder. They treated a rare form of blindness known as Leber’s congenital amaurosis. Children who were once blind can now see.

As “The Forever Fix” shows, gene therapy is now starting its climb up the Slope of Enlightenment. Hundreds of clinical trials are under way to see if gene therapy can treat other diseases, both in and beyond the eye. It still costs a million dollars a patient, but that cost is likely to fall. It’s not yet clear how many other diseases gene therapy will help or how much it will help them, but it is clearly not a false hope.

Gene therapy produced so much excitement because it appealed to the popular idea that genes are software for our bodies. The metaphor only goes so far, though. DNA does not float in isolation. It is intricately wound around spool-like proteins called histones. It is studded with caps made of carbon and hydrogen atoms, known as methyl groups. This coiling and capping of DNA allows individual genes to be turned on and off during our lifetimes.

The study of this extra layer of control on our genes is known as epigenetics. In “The Epigenetics Revolution,” molecular biologist Nessa Carey offers an enlightening introduction to what scientists have learned in the past decade about those caps and coils. While she delves into a fair amount of biological detail, she writes clearly and compellingly. As Ms. Carey explains, we depend for our very existence as functioning humans on epigenetics. We begin life as blobs of undifferentiated cells, but epigenetic changes allow some cells to become neurons, others muscle cells and so on.

Epigenetics also plays an important role in many diseases. In cancer cells, genes that are normally only active in embryos can reawaken after decades of slumber. A number of brain disorders, such as autism and schizophrenia, appear to involve the faulty epigenetic programming of genes in neurons.

Scientists got their first inklings about epigenetics decades ago, but in the past few years the field has become hot. In 2008 the National Institutes of Health pledged $190 million to map the epigenetic “marks” on the human genome. New biotech start-ups are trying to carry epigenetic discoveries into the doctor’s office. The FDA has approved cancer drugs that alter the pattern of caps on tumor-cell DNA. Some studies on mice hint that it may be possible to treat depression by taking a pill that adjusts the coils of DNA in neurons.

People seem to be getting giddy about the power of epigenetics in the same way they got giddy about gene therapy in the 1990s. No longer is our destiny written in our DNA: It can be completely overwritten with epigenetics. The excitement is moving far ahead of what the science warrants—or can ever deliver. Last June, an article on the Huffington Post eagerly seized on epigenetics, woefully mangling two biological facts: one, that experiences can alter the epigenetic patterns in the brain; and two, that sometimes epigenetic patterns can be passed down from parents to offspring. The article made a ridiculous leap to claim that we can use meditation to change our own brains and the brains of our children—and thereby alter the course of evolution: “We can jump-start evolution and leverage it on our own terms. We can literally rewire our brains toward greater compassion and cooperation.” You couldn’t ask for a better sign that epigenetics is climbing the Peak of Inflated Expectations at top speed.

The title “The Epigenetics Revolution” unfortunately adds to this unmoored excitement, but in Ms. Carey’s defense, the book itself is careful and measured. Still, epigenetics will probably be plunging soon into the Trough of Disillusionment. It will take years to see whether we can really improve our health with epigenetics or whether this hope will prove to be a false one.

The Forever Fix

By Ricki Lewis. St. Martin's, 323 pages, $25.99

The Epigenetics Revolution

By Nessa Carey. Columbia, 339 pages, $26.95

—Mr. Zimmer's books include "A Planet of Viruses" and "Evolution: Making Sense of Life," co-authored with Doug Emlen, to be published in July.