Arquivo da tag: ciência

E se a ciência provar que a vida continua após a morte? (Folha de S.Paulo)

Psiquiatra debate em SP se a consciência sobrevive ao corpo

6.out.2025 às 21h35; Atualizado: 7.out.2025 às 14h41

Juliano Spyer

A ideia de robôs com consciência, exposta no filme “Blade Runner”, parece se aproximar da realidade.

No Vale do Silício, o tema já é tratado como uma religião pelo movimento conhecido como transumanismo. É um bom momento para discutir a relação entre espiritualidade e ciência.

O Brasil, especialmente o mundo popular, é profundamente religioso. Ainda assim, mesmo leitores desta coluna que acreditam em Deus podem estranhar a ideia de misturar fé e ciência. Mas por que não?
Se você acredita na existência de Deus e que a consciência é mais que o resultado de combinações aleatórias que produziram a vida, por que não pesquisar sobre isso?

Responder a essa pergunta se tornou a missão do psiquiatra brasileiro Alexander Moreira-Almeida (UFJF), fundador do Núcleo de Pesquisas em Espiritualidade e Saúde (Nupes). Seu trabalho, que investiga a relação entre espiritualidade e saúde mental, tem ganhado reconhecimento internacional.

Cena de “Blade Runner, O Caçador de Androides” – Divulgação

Neste ano, ele recebeu o Prêmio Oskar Pfister, concedido anualmente pela American Psychiatric Association (APA) a pesquisadores que estudam temas na intersecção entre ciência e religião. O neurologista Oliver Sacks, o historiador Peter Gay e o filósofo Paul Ricoeur estão entre os agraciados de edições anteriores.

A força do trabalho de Alexander está no fato de ele produzir ciência dentro da universidade, com metodologia médica e atuação internacional. Ele foi coordenador da seção de saúde mental e espiritualidade da Associação Mundial de Psiquiatria e também se dedica à divulgação científica, publicando conteúdo acessível ao público não especializado.

No livro “Ciência da Vida após a Morte” (Springer, 2022), feito com dois coautores, ele alerta para a influência de uma ideologia dominante —o fisicalismo materialista— que considera a espiritualidade uma fantasia humana, embora a ciência jamais tenha provado que a consciência morre junto com o corpo físico.

A segunda parte da obra apresenta evidências empíricas que sugerem a possibilidade de sobrevivência dessa consciência. Entre os estudos analisados estão pesquisas sobre mediunidade, experiências de quase-morte e reencarnação.

Curioso? Na próxima semana você poderá fazer perguntas diretamente a Alexander. Ele participará de um debate com o diretor de Redação desta Folha, o jornalista Sérgio Dávila, e com a psicóloga Marta Helena de Freitas, professora da Universidade Católica de Brasília e presidente da International Association for the Psychology of Religion (IAPR).

O evento faz parte da série Conversas Difíceis, da qual sou curador. Será na segunda-feira (13), às 19h, no espaço Civi-co (rua Dr. Virgílio de Carvalho Pinto, 445 – Pinheiros, SP). A entrada é gratuita, mas as vagas são limitadas. Haverá transmissão ao vivo pelo canal do YouTube do Instituto Humanitas360.

O que mudaria se a ciência admitisse —e eventualmente comprovasse— que a vida não termina com o corpo? Como isso afetaria nossa visão de mundo e o modo como escolhemos viver e morrer?

Apareça. Leve sua curiosidade.

“Estamos ultrapassando seis dos nove limites planetários”, alerta cientista Johan Rockström (Um Só Planeta)

O cientista sueco Johan Rockström, diretor do Instituto Potsdam para Pesquisa de Impacto Climático (PIK), é reconhecido mundialmente por ter desenvolvido a estrutura dos limites planetários

Por Naiara Bertão

Um Só Planeta — São Paulo

28/08/2025

O cientista sueco Johan Rockström, diretor do Instituto Potsdam para Pesquisa de Impacto Climático (PIK) — Foto: Naiara Bertão / Um Só Planeta

O cientista sueco Johan Rockström, diretor do Instituto Potsdam para Pesquisa de Impacto Climático (PIK), voltou a chamar atenção para os riscos que a humanidade corre ao avançar sobre os limites ambientais que garantem a estabilidade da Terra. Reconhecido mundialmente por ter desenvolvido a estrutura dos limites planetários em 2009, Rockström afirmou que já estamos numa situação perigosa, em que a própria sobrevivência de sociedades humanas complexas está em jogo.

O cientista participou nesta terça-feira (26) do encontro Futuro Vivo, evento organizado pela empresa de telecomunicações Vivo com o objetivo de ser um espaço de debate sobre os limites da tecnologia e de como desenvolver soluções sustentáveis para o meio ambiente.

Os limites planetários mostram exatamente os espaços seguros para um planeta estável — Foto: Divulgação/Netflix

Na palestra, ele retomou o conceito dos nove limites planetários que regulam o funcionamento da Terra para reforçar o alerta: ultrapassá-los compromete a estabilidade do planeta.

“Estamos começando a atingir o teto dos processos biofísicos que regulam a resiliência, a estabilidade e a habitabilidade da Terra”, disse em sua palestra.

“Seja em São Paulo, em Estocolmo ou em Pequim, o que acontece em diferentes partes do planeta interage e influencia a estabilidade de todo o sistema climático, da hidrologia e do suporte à vida na Terra. É por isso que precisamos definir um espaço operacional seguro para o desenvolvimento humano no planeta.”

A teoria dos limites planetários abrange nove processos: clima, biodiversidade, uso da terra, ciclos de nitrogênio e fósforo, recursos hídricos, oceanos, poluição do ar, camada de ozônio e poluentes químicos. “O grande avanço científico não foi apenas identificá-los, mas quantificá-los”, explicou.

Segundo o cientista, a noção de que era possível explorar recursos sem limites ficou no passado. “Há 50 anos, não precisávamos disso. Hoje, ocupamos o planeta inteiro e não há mais espaço para sermos insustentáveis.”

Logo no início de sua palestra, Rockström lembrou que o planeta atravessou, nos últimos 10 mil anos, o período mais estável de sua história recente: o Holoceno. Foi nessa era que surgiram a agricultura e as civilizações humanas, sustentadas por condições climáticas e ecológicas favoráveis. “O Holoceno é o único estado do planeta que sabemos com certeza ser capaz de sustentar nossa vida. É o que eu chamo de Jardim do Éden”, afirmou.

Seca histórica ameaça valiosas colheitas na Califórnia, maior produtora de amêndoas no mundo — Foto: Justin Sullivan / Getty Images

Contudo, essa estabilidade está sendo rompida com a ascensão do Antropoceno, a era em que o ser humano é a principal força de mudança no planeta. “O sistema econômico global está no banco do motorista, superando os impactos de erupções vulcânicas, variações solares e terremotos. Essas forças naturais ainda existem, mas nós as dominamos e até as sobrepujamos.”

Para Rockström, a pressão sobre os sistemas naturais pode levar a mudanças abruptas e irreversíveis.

“O planeta é um sistema complexo e auto-adaptativo, que tem pontos de inflexão. Se empurrarmos demais, a Amazônia, a Groenlândia ou os recifes de coral podem colapsar e passar para estados que deixarão de nos sustentar. Esses pontos de virada não apenas reduzem a resiliência dos ecossistemas, mas também ameaçam diretamente economias e sociedades.”

Para o cientista, os dados não deixam dúvidas. “Estamos em uma situação perigosa. Estamos ameaçando a saúde de todo o planeta.” Ele explica que foram definidas zonas seguras, zonas de incerteza e zonas de alto risco na metodologia dos limites planetários.

“O problema é que, em 2023, já mostramos que seis desses nove limites estão sendo ultrapassados — clima, biodiversidade, mudanças no uso da terra, consumo de água doce, excesso de nitrogênio e fósforo, e a enorme carga de substâncias químicas no sistema terrestre.”
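Para visualizar o quadro descrito por Rockström, segue um esboço mínimo em Python que organiza os nove limites citados na palestra e marca os seis apontados como ultrapassados na atualização de 2023. Os rótulos e o agrupamento são uma aproximação didática feita aqui, não a nomenclatura oficial do arcabouço.

```python
# Esboço ilustrativo: os nove limites planetários citados na palestra e os seis
# apontados como ultrapassados em 2023. Rótulos simplificados, apenas para ilustração.

limites = {
    "clima": True,
    "biodiversidade": True,
    "mudanças no uso da terra": True,
    "consumo de água doce": True,
    "ciclos de nitrogênio e fósforo": True,
    "poluentes químicos (novas entidades)": True,
    "acidificação dos oceanos": False,
    "camada de ozônio": False,
    "poluição do ar (aerossóis)": False,
}

ultrapassados = [nome for nome, fora_da_zona_segura in limites.items() if fora_da_zona_segura]
print(f"{len(ultrapassados)} de {len(limites)} limites ultrapassados:")
for nome in ultrapassados:
    print(" -", nome)
```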

Sobrevoo do Greenpeace mostra a expansão do garimpo na terra Yanomami em 2021 — Foto: Christian Braga/Greenpeace

Essa constatação tem relação direta com o debate sobre políticas públicas no Brasil e no mundo. A Amazônia, por exemplo, é um dos sistemas mais próximos de um ponto de inflexão — o limiar a partir do qual mudanças irreversíveis podem ser desencadeadas, como o próprio cientista alertou ao citar o risco de colapso da floresta, da Groenlândia e dos recifes de coral.

Apesar do alerta, o cientista vê na pesquisa uma ferramenta de esperança. Desde 2009, a metodologia dos limites planetários foi refinada e hoje já permite oferecer parâmetros para políticas públicas e decisões empresariais. “Hoje conseguimos oferecer à humanidade um mapa de navegação do Antropoceno. Definimos as fronteiras seguras para o futuro da vida na Terra. Isso nos dá a possibilidade de sermos responsáveis em escala planetária”, disse.

Para Rockström, reconhecer esses limites não é apenas uma questão científica, mas de sobrevivência.

A boa notícia, diz, é que já temos as soluções e já sabemos o que deve ser feito. Seguir o Acordo de Paris e limitar o aquecimento do planeta a 1,5ºC é primordial e, segundo ele, possível. Mas o ritmo das mudanças precisa acelerar urgentemente.

Papel da política internacional e da COP30

A fala de Rockström chega em um momento estratégico: o Brasil se prepara para sediar a COP30, em Belém (PA) em novembro. A conferência deve ser marcada pelo foco em florestas tropicais e na transição justa para países em desenvolvimento. O conceito dos limites planetários, cada vez mais adotado por governos e empresas, oferece um “mapa de navegação” para esse processo.

Para especialistas, integrar esse tipo de ciência ao processo político será crucial para que a COP30 avance em compromissos concretos — especialmente em temas como desmatamento zero, proteção da biodiversidade e financiamento climático.

“Estamos ameaçando a saúde de todo o planeta. Esse é o diagnóstico da ciência, e ele deve servir como base para qualquer estratégia de desenvolvimento daqui para frente”, concluiu Rockström.

Economistas ainda pensam em crescimento eterno, diz José Eli da Veiga (Folha de S.Paulo)

Professor da USP defende noção de crescer decrescendo e afirma que COP30 pode ser a mais difícil de todas; leia trechos

‘Rio Comprido VIII’ (2019), monotipia de Luiz Zerbini – Pat Kilgore/Divulgação

10.mai.2025 às 7h00 – artigo original

Eduardo Sombini

Doutor em geografia pela Unicamp, é repórter da Ilustríssima

Nem abandonar a ideia de crescimento econômico nem confiar nela cegamente.

José Eli da Veiga, professor sênior do Instituto de Estudos Avançados da USP, recorre a essa dupla negativa para sintetizar sua análise em “O Antropoceno e o Pensamento Econômico” (Editora 34), terceiro volume de sua trilogia sobre as ciências e as humanidades em um período de crise climática e transformação acelerada do planeta pela sociedade.

No livro, o intelectual revisita escolas e pensadores à margem do mainstream da economia para sustentar que a disciplina não acompanhou o avanço da fronteira do conhecimento e ainda passa ao largo, por exemplo, da teoria da evolução e da física moderna.

Em razão disso, Veiga argumenta, o pensamento econômico ignora os fluxos de energia e matéria envolvidos no processo de produção, o que faz com que economistas concebam um crescimento eterno e não se preocupem com as condições de vida das gerações futuras.

José Eli da Veiga, autor de ‘O Antropoceno e o Pensamento Econômico’, durante o seminário USP Pensa Brasil – Cecília Bastos – 4.out.23/USP Imagens

Na entrevista, o pesquisador fala sobre as ideias de crescer decrescendo e decrescer crescendo, um caminho do meio entre manter o modelo atual e as propostas de decrescimento da economia.

Veiga também discute o impasse em fóruns multilaterais dedicados à crise ambiental, como as COPs. Para ele, negociações entre as corporações e os governos responsáveis pela maior parte das emissões de gases do efeito estufa teriam mais resultado que encontros anuais com a participação de mais de uma centena de países.

Leia abaixo os trechos principais da entrevista. A íntegra está disponível em áudio.

O pensamento econômico hoje

Existe uma corrente muito secundária, vista pelos economistas como uma coisa heterodoxa e estranha, a economia evolucionária. Tem uma muito forte, a economia institucional. Tem uma bem sólida, mas que não é muito reconhecida, a economia ecológica. Mas, se você perguntar como uma inteligência artificial classifica as várias correntes da economia, o risco é que nem apareçam essas que eu citei.

Porque as principais são aquelas que, no fundo, formam o currículo tradicional de um curso de economia: macro, micro, história do pensamento econômico, um pouco de história econômica. A formação de um economista é mais ou menos essa.

Será que uma das humanidades —a economia não é uma ciência— precisa ser compatível com a física e com a biologia, para não falar de química e geociências? Minha tendência é dizer que é errado ser incompatível.

Tem ramos da economia que avançaram muito, principalmente aqueles afeitos à modelização matemática, mas a economia ainda hoje é absolutamente prisioneira da mecânica clássica e, principalmente, da ideia de equilíbrio. Ignora totalmente a termodinâmica, para começar. Você chega a conclusões muito diferentes a respeito de como pode ser o desenvolvimento se levar em conta ou não a termodinâmica.

O conceito de entropia

Uma das primeiras coisas com que um estudante de economia se defronta é um diagrama do fluxo circular, que explica como funciona o chamado sistema econômico. Não entra nada nem sai nada desse sistema. Ele ignora a entrada de energia —nós somos uma dádiva do Sol— e, principalmente, todos os resíduos, do outro lado, além da entropia.

O que interessa para um economista na questão da entropia? Quando se usa energia —e nós não fizemos outra coisa que procurar fontes de energia que nos dessem cada vez mais produtividade—, parte dessa energia se dissipa. Permanentemente, estamos perdendo uma boa parte da energia que mobilizamos.

A rigor, a longo prazo, você não pode pensar em crescimento econômico. Você tem que pensar que o futuro da humanidade ou o desenvolvimento vão ter que prescindir do crescimento. Essa é uma conclusão que choca um economista ortodoxo, tradicional. Para eles, é subentendido que o crescimento é uma coisa eterna.

A economia e a ética

A dicotomia entre a economia como ética e uma economia mais logística, que a gente normalmente chama de o lado engenheiro da economia, é bem antiga. Houve tentativas teóricas de dizer que a economia devia se limitar só a esse aspecto logístico e não entrar em nenhum tipo de consideração ética. Evidentemente, isso não é uma coisa que foi seguida pelos economistas, mesmo por economistas que eu classificaria como ortodoxos. Uma parte deles, ao contrário, é bem ligada em considerações éticas.

Para nós, isso é muito importante porque o aquecimento global —para não falar de todos os outros prejuízos ao meio ambiente que a gente vem causando de forma muito intensa há pelo menos uns 80 anos— coloca em questão as condições de vida das próximas gerações. Esse é um dilema ético para nós.

Resultados frustrantes das COPs

Uma das coisas chocantes é notar que a questão da camada de ozônio, que era complicadíssima, teve um arranjo de cooperação internacional que deu muito resultado. Por quê? Como foi o formato?

No início, só se juntaram os que mais eram responsáveis pelo assunto. Eram poucos países que tinham as empresas que faziam o estrago. A partir disso, paulatinamente, foram ganhando adesões à convenção. É difícil encontrar, pelo menos na área ambiental, outra convenção ou outro tratado que tenha tido tanto sucesso.

Quando, em 1988, se criou o IPCC, houve muita pressa, porque a Rio-92 estava marcada e ia ser uma coisa muito importante. Mais importante que mera pressa, havia a conjuntura internacional geopolítica desse período. Ainda se vivia muito daquele entusiasmo e otimismo que surgiu a partir da queda da União Soviética. Hoje, olhando com a facilidade de estar distante disso, parece uma coisa infantil imaginar que você poderia fazer uma assembleia anual de todos os países do mundo e chegar a algum tipo de decisão.

A Convenção do Clima criou uma arena para que houvesse disputas políticas das mais variadas. No início, era sempre o Sul querendo dizer que a culpa era do Norte e que eles tinham que pagar. Depois, foram encontrando algumas saídas e, no famoso Acordo de Paris de 2015, a ideia é que cada país vai determinar ele mesmo qual é a contribuição que pode dar. Isso foi um grande avanço.

Neutralidade de carbono

No meio disso, com um grupo de Oxford liderando, cientistas começaram a levantar a ideia de que existem emissões que podem ser, de certa forma, abatidas —quando, por exemplo, uma área desmatada é restaurada— e isso levou à ideia de compensação de carbono.

Foi um tremendo desserviço. Quer dizer, tinha um lado bom e um lado ruim. O lado bom é que muitas empresas que olhavam para a questão do aquecimento global e viam sempre como um sacrifício ter que reduzir emissões, de repente, falaram: “Bom, vamos também poder abater aquilo que a gente faz de positivo”. Isso deu um certo incentivo para que elas não simplesmente banissem a ideia do aquecimento global.

Por outro lado, as empresas que mais emitem acharam o máximo. “Comprar uns créditos de carbono do pessoal que restaura na Amazônia, se a gente for muito pressionado, senão vamos continuar emitindo”. O resultado? É só olhar o que aconteceu.

Do Acordo de Paris para cá, as emissões de CO2 equivalente não pararam de aumentar, em um ritmo que é difícil imaginar se seria diferente. O impacto dos créditos de carbono nem começou a fazer cócegas por enquanto. Conheço muitos colegas que acreditam que, por volta de 2050, haja neutralidade de carbono –quer dizer, que o aquecimento global vai continuar, mas que as emissões estariam sendo mais ou menos integralmente abatidas por esses descontos.

Quando olho os números, acho que o máximo que se pode dizer é que talvez seja um problema que tenha solução neste século, mas não vai ser desse jeito, com essa convenção.

Um novo modelo para as COPs

Do meu ponto de vista, o que pode melhor acontecer é que, em algum momento, esse mesmo sistema de COPs descubra que é preciso reconsiderar a própria convenção. Hoje, a gente sabe que 80% das emissões saem de 57 empresas que estão em 34 países.

Se você juntasse esses 34 países em vez de juntar mais de cem uma vez por ano, eles não demorariam para encontrar uma maneira de se comprometer com um esquema de redução. Por exemplo, o chamado “cap and trade”: você fixa uma meta de redução das emissões para o ano que vem e as empresas que tiverem conseguido atingir essa meta recebem créditos que poderão ser vendidos para aquelas que ainda não conseguiram. Um esquema desse tipo é o que funciona no mercado de carbono europeu.
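O mecanismo de “cap and trade” descrito acima cabe em poucas linhas de código. O esboço em Python a seguir usa empresas, cotas e preço de crédito totalmente hipotéticos, apenas para ilustrar a lógica: quem emite abaixo da própria cota gera créditos para vender; quem estoura a cota precisa comprá-los.

```python
# Esboço ilustrativo de "cap and trade"; empresas, emissões, cotas e preço são hipotéticos.

emissoes = {"EmpresaA": 120.0, "EmpresaB": 80.0, "EmpresaC": 100.0}  # MtCO2e emitidas no ano
cotas = {"EmpresaA": 100.0, "EmpresaB": 100.0, "EmpresaC": 100.0}    # teto (cap) por empresa, em MtCO2e
preco_do_credito = 85.0                                              # valor ilustrativo, em EUR por tCO2e

for empresa, emitido in emissoes.items():
    saldo = cotas[empresa] - emitido            # positivo: sobram créditos; negativo: faltam
    fluxo = saldo * 1e6 * preco_do_credito      # converte Mt -> t; positivo = receita com a venda
    situacao = "vende créditos" if saldo > 0 else ("compra créditos" if saldo < 0 else "neutra")
    print(f"{empresa}: saldo {saldo:+.1f} Mt, {situacao}, fluxo de {fluxo / 1e6:+,.0f} milhões de EUR")
```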

Aos poucos, você teria muito mais resultados se o arranjo fosse só com esses 34 países ou essas 57 empresas —ou a parte deles que topasse. Se a convenção não fosse abolida, as COPs poderiam começar a ser reunidas de cinco em cinco anos. É um desperdício de tudo, de dinheiro, de energia. Essas COPs são uma coisa assustadora.

Expectativas para a COP30

Do ponto de vista das negociações diplomáticas, acho que vai ser praticamente mais do mesmo. Sempre aparece alguma coisa que você pode usar para dizer que foi um avanço, mas, no frigir dos ovos, não vai ter nada de significativo nesse plano.

Só que surgiu uma novidade muito importante. No discurso do Lula na Assembleia Geral da ONU, ele fez a sugestão de que nós fizéssemos um balanço ético global.

A ideia é que o balanço seja feito a partir do momento em que todos os países apresentem os seus compromissos nacionalmente determinados, os NDCs, e pouquíssimos países, por enquanto, apresentaram. Vai ficar muito em cima da COP, em novembro, que se terá esse conjunto e se poderá começar a fazer esse balanço.

Não vai ser exatamente na COP, mas, com isso, a COP poderá ter desencadeado na sociedade civil uma dinâmica que ainda não existe: a sociedade civil mundial se mobilizar em torno desse balanço ético global e isso gerar uma forma de maior responsabilização e pressão sobre o conjunto dos países. Se eu não estiver muito enganado, vai acontecer algo de muito positivo, mas meio que fora da COP em si, que virou uma espécie de feira anual de lobistas.

Não vai ser muito diferente desta vez —e com conflitos. Tem tanta gente na Amazônia e tantas tendências da sociedade civil muito mobilizadas em torno disso que é provável que seja, de todas as 30, a mais difícil de conduzir.

Crescer decrescendo

Considero essa ideia uma espécie de ovo de Colombo, porque fica um debate entre os decrescentistas e aqueles que dizem: “Olha a fórmula que funcionou até hoje. Você terá população em queda, educação e inovações institucionais e tecnológicas continuando. Os problemas ambientais meio que se resolvem pelos preços. Não tem que ficar discutindo se tem que ter ou não crescimento. Quanto mais crescer, melhor”.

No entanto, quando você para para pensar em termos práticos, tem coisas que não podem mais crescer e tem outras que são promissoras e que precisam ter espaço para crescer. Não se trata de dizer, para quem está com a responsabilidade da política econômica, que deva pisar no acelerador ou no freio do crescimento. Ao contrário.

Tudo o que emite e queima energia fóssil demais, o ideal é que decresça. As energias renováveis precisam crescer. Estou falando do terreno da energia, mas você pode encontrar exemplos em todos os terrenos. É permanente esse caminho do meio.

Tem uma ideia que eu procuro ressaltar no fim do livro: o fundamental é desacoplar. Este é o verbo-chave da mensagem que a gente pode tirar de uma análise sobre o Antropoceno. Desacoplar, fundamentalmente, significa que tenho que procurar ao máximo possível estimular as atividades que usem menos energias fósseis e que, portanto, emitam menos. Não é o único desacoplamento, mas é o principal.

A viabilidade política da ideia

Na conjuntura atual, diria que é uma inviabilidade. O principal sinal disso é o Trump, mas não está sendo assim na China, e a União Europeia está na vanguarda. Para construir a ideia, não posso condicioná-la ao fato de a agenda ser ou não realista.

Se não for, se eu estiver errado e essa conjuntura extremamente negativa perdurar, pior para vocês que estarão vivos [risos].

Por que previsões de terremotos falham tanto (BBC/Folha de S.Paulo)

Nas redes sociais, um autoproclamado ‘previsor’ de terremotos diz que consegue prever grandes tremores, mas especialistas afirmam que é pura sorte

Artigo original (Folha de S.Paulo)

26.mar.2025 às 15h22

Ana Faguy, Christal Hayes e Max Matza

BBC News

Brent Dmitruk se autodenomina um “previsor” de terremotos.

Em meados de outubro, ele disse às suas dezenas de milhares de seguidores nas redes sociais que um terremoto atingiria em breve o ponto mais ocidental da Califórnia, ao sul da pequena cidade costeira de Eureka, nos EUA.

Dois meses depois, um tremor de magnitude 7,3 atingiu o local, no norte da Califórnia, colocando milhões de pessoas sob alerta de tsunami e aumentando o número de seguidores de Dmitruk, que passaram a confiar nele para prever o próximo abalo sísmico.

“Então, para as pessoas que menosprezam o que eu faço: como vocês podem argumentar que é apenas uma coincidência? É preciso ter muita habilidade para descobrir para onde os terremotos vão”, afirmou ele na véspera do Ano Novo.

Por que previsões de terremotos falham tanto – Getty Images via BBC

Mas há um problema: os terremotos não podem ser previstos, dizem os cientistas que estudam o fenômeno.

É exatamente essa imprevisibilidade que os torna tão perturbadores. Milhões de pessoas que vivem na costa oeste da América do Norte temem que o “Big One” (que significa “O Grande”) possa acontecer a qualquer momento, alterando paisagens e inúmeras vidas.

O terremoto de Northridge, em Los Angeles, que matou 57 pessoas e feriu milhares de outras, em 1994, foi o abalo sísmico mais mortal nos EUA na memória recente – Getty Images via BBC

Lucy Jones, sismóloga que trabalhou para o Serviço Geológico dos EUA (USGS, na sigla em inglês) por mais de três décadas, e é autora de um livro chamado The Big Ones, concentrou grande parte de sua pesquisa nas probabilidades de terremotos e na melhoria da resiliência para resistir a esses eventos cataclísmicos.

Desde que começou a estudar terremotos, Jones conta que sempre houve pessoas querendo uma resposta para quando o “Big One”–que significa coisas diferentes, em regiões diferentes–vai acontecer, e alegando ter desvendado a questão.

“A necessidade humana de criar um padrão diante do perigo é extremamente forte, é uma resposta humana bastante normal ao medo”, diz ela à BBC. “No entanto, isso não tem nenhum poder de previsão.”

Com cerca de 100 mil terremotos registrados no mundo todo a cada ano, de acordo com o USGS, é compreensível que as pessoas queiram ser avisadas.

A região de Eureka, uma cidade costeira a 434 quilômetros ao norte de San Francisco, onde ocorreu o terremoto de dezembro, registrou mais de 700 terremotos somente no último ano–incluindo mais de 10 apenas na última semana, segundo os dados.

A região, onde Dmitruk adivinhou corretamente que haveria um terremoto, é uma das “áreas sismicamente mais ativas” dos EUA, de acordo com o USGS. Sua volatilidade se deve ao encontro de três placas tectônicas, uma área conhecida como Junção Tripla de Mendocino.

É o movimento das placas em relação umas às outras – seja acima, abaixo ou ao lado – que causa o acúmulo de tensão. Quando essa tensão é liberada, pode ocorrer um terremoto.

Adivinhar que um tremor aconteceria aqui é uma aposta fácil, diz Jones, embora um terremoto forte, de magnitude sete, seja bastante raro.

O USGS destaca que houve apenas 11 terremotos deste tipo ou mais fortes desde 1900. Cinco deles, incluindo o que Dmitruk promoveu nas redes sociais, ocorreram na mesma região.

Embora o palpite estivesse correto, Jones afirma à BBC que é improvável que qualquer terremoto –inclusive os maiores, que devastam a sociedade– possa ser previsto com precisão.

Segundo ela, há um conjunto complexo e “dinâmico” de fatores geológicos que levam a um terremoto.

A magnitude de um terremoto provavelmente se define à medida que o evento ocorre, explica Jones, usando o ato de rasgar um pedaço de papel como analogia: o rasgo vai continuar, a menos que haja algo que o interrompa ou retarde – como marcas de água que deixam o papel molhado.

Os cientistas sabem por que ocorre um terremoto – movimentos repentinos ao longo de falhas geológicas–, mas prever este evento é algo que, segundo o USGS, não pode ser feito, e algo que “não esperamos descobrir em um futuro próximo”.

San Francisco ficou em ruínas após o terremoto de 1906 – Getty Images via BBC

A agência observa que pode calcular a probabilidade de terremotos em uma região específica dentro de um determinado número de anos – mas isso é o mais próximo que eles conseguem chegar.

Os registros geológicos mostram que alguns dos terremotos de maiores proporções, conhecidos como “Big Ones” pelos moradores locais, acontecem com certa regularidade. Sabe-se que a zona de subducção de Cascadia desliza a cada 300 a 500 anos, devastando regularmente a costa noroeste do Pacífico com megatsunamis de 30,5 metros de altura.

A falha de San Andreas, no sul da Califórnia, também é fonte de outro potencial “Big One”, com terremotos devastadores ocorrendo a cada 200 a 300 anos. Especialistas afirmam que o “Big One” pode acontecer a qualquer momento em qualquer uma das regiões.
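A “probabilidade de terremotos em uma região específica dentro de um determinado número de anos”, citada acima, pode ser ilustrada com um modelo bem simplificado. O esboço em Python abaixo assume, apenas para fins didáticos, um processo de Poisson com os intervalos médios de recorrência mencionados no texto; não é a metodologia do USGS.

```python
import math

# Probabilidade de ao menos um grande terremoto em uma janela de tempo, assumindo um
# processo de Poisson com intervalo médio de recorrência fixo. Ilustração didática,
# não a metodologia usada pelo USGS.

def prob_ao_menos_um(intervalo_medio_anos: float, janela_anos: float) -> float:
    taxa = 1.0 / intervalo_medio_anos           # eventos esperados por ano
    return 1.0 - math.exp(-taxa * janela_anos)  # P(pelo menos 1 evento na janela)

# Intervalos citados no texto: Cascadia a cada 300-500 anos; San Andreas (sul) a cada 200-300 anos.
cenarios = [("Cascadia (500 anos)", 500), ("Cascadia (300 anos)", 300), ("San Andreas (250 anos)", 250)]
for nome, intervalo in cenarios:
    print(f"{nome}: ~{100 * prob_ao_menos_um(intervalo, 50):.0f}% de chance em 50 anos")
```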

Jones conta que, ao longo de sua carreira, milhares de pessoas a alertaram com previsões de um grande terremoto–inclusive indivíduos na década de 1990, que enviavam faxes para seu escritório na esperança de fazer um alerta.

“Quando você recebe uma previsão toda semana, alguém vai ter sorte, certo?”, diz ela rindo. “Mas isso geralmente subia à cabeça deles, e eles faziam mais 10 previsões que não estavam certas.”

Esta situação parece ter acontecido com Dmitruk, que não tem formação científica. Há muito tempo ele prevê que um terremoto incrivelmente grande atingiria o sudoeste do Alasca, o Japão ou as ilhas da costa da Nova Zelândia, com uma magnitude tão forte que, segundo ele, poderia interromper o comércio global.

O USGS afirma que uma previsão de terremoto precisa ter três elementos definidos – uma data e hora, o local e a magnitude do tremor – para ser útil.

Mas o cronograma de Dmitruk continua mudando.

Em um determinado momento, ele disse que o terremoto ocorreria imediatamente antes ou depois da posse do presidente dos EUA, Donald Trump.

Depois, ele anunciou que aconteceria, sem dúvida, antes de 2030.

Embora esse terremoto de grandes proporções ainda não tenha ocorrido, Dmitruk afirma que ainda acredita que vai acontecer.

“Não acredito que seja apenas por acaso”, diz Dmitruk à BBC. “Não é aleatório ou sorte.”

Este tipo de pensamento é comum quando se trata de terremotos, de acordo com Jones.

“Distribuições aleatórias podem parecer ter padrões, vemos constelações nas estrelas”, ela observa.

“Muita gente tem muito medo de terremoto, e a maneira de lidar com isso é prever [quando] eles vão acontecer.”

Como você pode se preparar diante da incerteza de um terremoto

No entanto, o fato de não ser possível prever quando vai acontecer um terremoto não significa que você deva estar despreparado, segundo especialistas.

Todos os anos, na terceira quinta-feira de outubro, milhões de americanos participam da maior simulação de terremoto do planeta: The Great Shake Out, que pode ser traduzida como “a grande sacudida”.

O exercício foi criado por um grupo do Centro de Terremotos do Sul da Califórnia, que incluía Jones.

Durante a simulação, as pessoas praticam a orientação de “se abaixar, se proteger e se segurar”: elas se ajoelham, se protegem sob um objeto resistente, como uma mesa, e se mantêm assim por um minuto.

O exercício se tornou tão popular desde sua criação que se espalhou da costa propensa a terremotos para outros Estados e países.

Se estiverem ao ar livre, as pessoas são aconselhadas a ir para um espaço aberto longe de árvores, edifícios ou linhas de transmissão de energia. Perto do oceano, os moradores praticam fugir para terrenos mais altos depois que o tremor cessa, para se preparar para a possibilidade de um tsunami.

“Agora, enquanto o solo não está tremendo, enquanto não é uma situação muito estressante, é realmente o melhor momento para praticar”, afirma Brian Terbush, gerente do Programa de Terremotos e Vulcões da Divisão de Gerenciamento de Emergências do Estado de Washington, nos EUA.

Além das simulações, os moradores dos Estados da Costa Oeste americana usam um sistema de alerta telefônico mantido pelo USGS, chamado ShakeAlert.

O sistema funciona por meio da detecção de ondas de pressão emitidas por um terremoto. Embora não possa prever quando um terremoto vai ocorrer em um futuro distante, ele fornece um alerta com segundos de antecedência que podem salvar vidas. É a coisa mais próxima de um “previsor” de terremotos que foi inventada até agora.
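Os “segundos de antecedência” vêm da diferença de velocidade entre as ondas sísmicas: a onda P (de pressão), detectada primeiro, viaja mais rápido do que a onda S, responsável pelo tremor mais destrutivo. O esboço em Python abaixo usa velocidades típicas da crosta encontradas na literatura (valores aproximados, não parâmetros do ShakeAlert) para estimar essa margem em função da distância ao epicentro.

```python
# Estimativa didática da antecedência de um alerta baseado na detecção da onda P.
# Velocidades aproximadas na crosta: onda P ~6,5 km/s, onda S ~3,7 km/s (valores típicos,
# não os parâmetros reais do ShakeAlert). Ignora o tempo de processamento e de envio do alerta.

VEL_ONDA_P = 6.5  # km/s
VEL_ONDA_S = 3.7  # km/s

def aviso_em_segundos(distancia_km: float) -> float:
    """Intervalo entre a chegada da onda P (detecção) e a da onda S (tremor forte)."""
    return distancia_km / VEL_ONDA_S - distancia_km / VEL_ONDA_P

for distancia in (20, 50, 100, 200):
    print(f"Epicentro a {distancia:>3} km: ~{aviso_em_segundos(distancia):.0f} s de aviso")
```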

Este texto foi publicado inicialmente aqui.

Médiuns têm alterações genéticas, mostra estudo coordenado pela USP (Folha de S.Paulo)

Pesquisa comparou pessoas identificadas com o dom com parentes de primeiro grau sem nenhuma habilidade do tipo

Anna Virginia Balloussier

18 de fevereiro de 2025


Ser médium não é necessariamente coisa do outro mundo. Pode estar nos genes, inclusive.

É o que sustenta um estudo que investiga as bases genéticas da mediunidade, publicado pelo Brazilian Journal of Psychiatry, revista científica em que os artigos são revisados por pares acadêmicos. A coordenação ficou a cargo de Wagner Farid Gattaz, professor do Instituto de Psiquiatria do Hospital das Clínicas da FMUSP (Faculdade de Medicina da Universidade de São Paulo) e à frente do Laboratório de Neurociências na universidade.

A pesquisa de campo, realizada entre 2020 e 2021, comparou 54 pessoas identificadas como médiuns com 53 parentes de primeiro grau delas, sem nenhuma habilidade do tipo. Umbanda e espiritismo foram as principais fontes religiosas do grupo.

“O estudo desvendou alguns genes que estão presentes em médiuns, mas não em pessoas que não o são e têm o mesmo background cultural, nutritivo e religioso”, diz Gattaz. “Isso significa que alguns desses genes poderiam estar ligados ao dom da mediunidade.”

A seleção dos participantes seguiu os seguintes critérios: recrutar médiuns reconhecidos pelo grau de acerto de suas predições, que praticavam pelo menos uma vez por semana a mediunidade e que não ganhavam dinheiro com ela, ou seja, não cobravam por consultas.

Os resultados revelaram quase 16 mil variantes genéticas encontradas exclusivamente neles, “que provavelmente impactam a função de 7.269 genes”. Conclui o texto publicado: “Esses genes surgem como possíveis candidatos para futuras investigações das bases biológicas que permitem experiências espirituais como a mediunidade”.
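A comparação descrita (variantes presentes nos médiuns, mas ausentes nos parentes usados como controle) é, no fundo, uma diferença entre conjuntos. O esboço em Python abaixo ilustra apenas essa etapa final, com variantes fictícias; o pipeline real de análise de exoma envolve alinhamento, chamada de variantes, controle de qualidade e anotação funcional, que não estão representados aqui.

```python
# Ilustração, com dados fictícios, da lógica de "variantes exclusivas de um grupo".

variantes_mediuns = {
    "medium_1": {"chr1:1234A>G", "chr7:5678C>T", "chr10:9999G>A"},
    "medium_2": {"chr1:1234A>G", "chr7:5678C>T"},
}
variantes_parentes = {
    "controle_1": {"chr1:1234A>G"},
    "controle_2": {"chr12:4321T>C"},
}

# Variantes vistas em algum médium, mas em nenhum parente de primeiro grau.
vistas_em_mediuns = set().union(*variantes_mediuns.values())
vistas_em_parentes = set().union(*variantes_parentes.values())
exclusivas = vistas_em_mediuns - vistas_em_parentes

print("Variantes exclusivas do grupo de médiuns:", sorted(exclusivas))
```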

“Esses genes estão em grande parte ligados ao sistema imune e inflamatório. Um deles, de maneira interessante, está ligado à glândula pineal, que foi tida por muitos filósofos e pesquisadores do passado como a glândula responsável pela conexão entre o cérebro e a mente”, afirma o coordenador da pesquisa. Ele frisa, contudo, que essa hipótese precisaria ser confirmada experimentalmente.

Coautor do estudo e diretor do Nupes (Núcleo de Pesquisas em Espiritualidade e Saúde), da Universidade Federal de Juiz de Fora, Alexander Moreira-Almeida justifica a opção por contrastar os médiuns com seus familiares. “Se eu pegasse um grupo de controle que fosse uma outra pessoa qualquer, aleatória, poderia ter muita diferença sociocultural, econômica e também da própria genética. Quando a gente pega um parente, vai ter uma genética muito mais parecida e um background sociocultural muito mais próximo.”

Os pesquisadores analisaram o exoma dos voluntários, que contém os genes que codificam as proteínas. Muitas partes do genoma, que é a sequência completa do DNA, não têm função muito clara. “Já o exoma é aquela que provavelmente é mais ativa funcionalmente, com maior impacto sobre a formação do corpo da pessoa”, explica Moreira-Almeida.

Darcy Neves Moreira, 82, serviu de objeto de estudo para a trupe da ciência. Ela é professora aposentada e coordena reuniões num centro espírita na zona norte carioca.

Descreve o dom que lhe atribuem como a capacidade de “sentir a influência dos espíritos”. Tinha 18 anos quando detectou a sua, conta. “Senti uma presença do meu lado. Comecei a pensar algumas coisas sobre o nosso trabalho. Percebi que não eram propriamente minhas ideias, mas ideias sugeridas pelo amigo que estava ali pertinho de mim.”

São mais de seis décadas “tentando aprimorar esse canal de comunicação com os espíritos, o que me traz muita alegria”, ela afirma. “É a certeza de que continuamos a viver em outro plano.”

Roberto Lúcio Vieira de Souza, 66, também cedeu uma amostra de sua saliva para a pesquisa. Diz que compreendeu seu potencial mediúnico na adolescência, quando apresentava sintomas que os médicos não conseguiam explicar. Tinha câimbras dolorosas durante o sono, “acompanhadas da sensação iminente de morte, o que me desequilibrava emocionalmente”.

Integrava um movimento católico para jovens na época e queria inclusive virar padre. Acabou numa casa de umbanda para entender as agruras físicas. “Lá fui informado sobre a mediunidade e comecei a desenvolvê-la.”

Souza diz que mais tarde descobriu por que se sentia tão mal. Numa vida passada, fora um senhor de escravizados que, irritado com um deles, mandou amarrá-lo no pátio da casa e passar uma carroça sobre suas pernas. Ele foi deixado por dias ali, até morrer.

O próprio espírito do homem assassinado teria lhe contado essa versão. “Ele gritava que queria as suas pernas de volta e que eu deveria morrer com muita dor nas pernas, como ele. Segundo um amigo espiritual, essa era uma atitude comum da minha pessoa naquela encarnação. Eu teria sido um homem muito irascível, orgulhoso e cruel.”

Psiquiatra e autor de livros psicografados, Souza é diretor de uma instituição espírita em Belo Horizonte que se diz especializada em saúde mental e dependência química.

Gattaz, no comando da turma que esquadrinhou sua genética, afirma que não é preciso ter um determinado combo de genes para ser um médium. “O nosso estudo mostra apenas que alguns desses genes são candidatos para serem estudados em novas pesquisas e podem contribuir para o desenvolvimento do dom da mediunidade.”

Ele já havia liderado outro front do estudo, que apura a saúde mental nos ditos médiuns. Saldo: eles não se diferenciam de seus parentes na prevalência de transtornos mentais. Não apresentaram, por exemplo, sintomas de quadros psicóticos, como desorganização cognitiva ou paranoia.

Pontuaram mais, contudo, nos itens que avaliaram alterações na sensopercepção —como ver ou ouvir o que outros não percebem.

Scientists have a new explanation for the last two years of record heat (Washington Post)

Shannon Osaka

Feb 16, 2025


For the past few years, scientists have watched, aghast, as global temperatures have surged — with both 2023 and 2024 reaching around 1.5 degrees Celsius above the preindustrial average. In some ways, that record heat was expected: Scientists predicted that El Niño, combined with decreasing air pollution that cools the earth, would cause temperatures to skyrocket.

But even those factors, scientists say, are not sufficient to explain the world’s recent record heat.

Earth’s overall energy imbalance — the amount of heat the planet is taking in minus the amount of heat it is releasing — also continues to rise, worrying scientists. The energy imbalance drives global warming. If it rises, scientists expect global temperatures to follow.

Two new studies offer a potential explanation: fewer clouds. And the decline in cloud cover, researchers say, could signal the start of a feedback loop that leads to more warming.

“We have added a new piece to the puzzle of where we are headed,” Helge Goessling, a climate physicist at the Alfred Wegener Institute in Germany and the author of one of the studies, said in a video interview.

For years, scientists have struggled to incorporate clouds’ influence into the large-scale climate models that help them predict the planet’s future. Clouds can affect the climate system in two ways: First, their white surfaces reflect the sun’s light, cooling the planet. But clouds also act as a kind of blanket, reflecting infrared radiation back to the surface of the planet, just like greenhouse gases.

Which factor wins out depends on the type of cloud and its altitude. High, thin cirrus clouds tend to have more of a warming effect on the planet. Low, fluffy cumulus clouds have more of a cooling effect.

“Clouds are a huge lever on the climate system,” said Andrew Gettelman, an affiliate scientist at the University of Colorado at Boulder. “A small change in clouds could be a large change in how we warm the planet.”

Researchers are beginning to pinpoint how clouds are changing as the world warms. In Goessling’s study, published in December in the journal Science, researchers analyzed how clouds have changed over the past decade. They found that low-altitude cloud cover has fallen dramatically — which has also reduced the reflectivity of the planet. The year 2023 — which was 1.48 degrees Celsius above the preindustrial average — had the lowest albedo since 1940.

In short, the Earth is getting darker.

That low albedo, Goessling and his co-authors calculated, contributed 0.2 degrees Celsius of warming to 2023’s record-high temperatures — an amount roughly equivalent to the warming that has so far been unexplained. “This number of about 0.2 degrees fairly well fits this ‘missing warming,’” Goessling said.
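The back-of-the-envelope link between a darker planet and extra warming can be sketched in a few lines. The numbers below (solar constant, climate sensitivity parameter, the size of the albedo drop) are textbook-style approximations chosen here for illustration; they are not the values used in Goessling's study.

```python
# Rough sketch: how a small drop in planetary albedo translates into warming.
# All constants are approximations assumed for illustration, not figures from the Science paper.

SOLAR_CONSTANT = 1361.0  # W/m^2 at the top of the atmosphere
SENSITIVITY = 0.8        # K of warming per W/m^2 of forcing (assumed sensitivity parameter)

def warming_from_albedo_drop(delta_albedo: float) -> float:
    """Extra warming (K) from an albedo decrease, via the extra sunlight absorbed globally."""
    extra_forcing = (SOLAR_CONSTANT / 4.0) * delta_albedo  # W/m^2, averaged over the sphere
    return SENSITIVITY * extra_forcing

# An albedo drop of ~0.00075 (a fraction of a percent) already yields roughly 0.2 K:
print(f"~{warming_from_albedo_drop(0.00075):.2f} K of extra warming")
```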

Researchers are still unsure exactly what accounts for this decrease. Some believe that it could be due to less air pollution: When particulates are in the air, it can make it easier for water droplets to stick to them and form clouds.

Another possibility, Goessling said, is a feedback loop from warming temperatures. Clouds require moisture to form, and moist stratocumulus clouds sit just underneath a dry layer of air about one mile high. If temperatures warm, hot air from below can disturb that dry layer, mixing with it and making it harder for wet clouds to form.

But those changes are difficult to predict — and not all climate models show the same changes. “It’s really tricky,” Goessling said.

Other scientists have also found declining cloud cover. In a preprint study presented at a science conference in December, a group of researchers at NASA found that some of the Earth’s cloudiest zones have been shrinking over the past two decades. Three areas of clouds — one that stretches around the Earth’s equator, and two around the stormy midlatitude zones in the Northern and Southern Hemispheres — have narrowed since 2000, decreasing the reflectivity of the Earth and warming the planet.

George Tselioudis, a climate scientist at NASA’s Goddard Institute for Space Studies and the lead author of the preprint, said this decrease in cloud cover can help explain why the Earth’s energy imbalance has been growing over the past two decades. Overall, the cloud cover in these regions is shrinking by about 1.5 percent per decade, he said, warming the Earth.

Tselioudis said that warming could be constraining these cloud-heavy regions — thus heating the planet. “We’ve always understood that the cloud feedback is positive — and it very well could be strong,” he said. “This seems to explain a big part of why clouds are changing the way they are.”

If the cloud changes are part of a feedback loop, scientists warn, that could indicate more warming coming, with extreme heat for billions of people around the globe. Every hot year buttresses the idea that some researchers have now embraced, that global temperature rise will reach the high end of what models had predicted. If so, the planet could pass 1.5 degrees Celsius later this decade.

Researchers now say that they are rushing to understand these effects as the planet continues to warm. “We are kind of in crunch time,” Goessling said. “We have a really strong climate signal — and from year to year it’s getting stronger.”

Gestora ambiental de Roraima recebe prêmio de ‘Cientista Indígena do Brasil’ por atuação sobre crise climática (G1)

Sineia Bezerra do Vale, indígena do povo Wapichana, atua há ao menos três décadas com discussões sobre a emergência do clima e defende que cientistas incluam as experiências dos povos tradicionais nos estudos sobre o assunto.

Por Valéria Oliveira, g1 RR — Boa Vista

27/05/2024 06h01

Sineia Bezerra do Vale, liderança indígena do povo Wapichana, ao receber o prêmio “Cientista Indígena do Brasil”, em São Paulo — Foto: Patricia Zuppi/Rede RCA/Cristiane Júlião/Divulgação

Referência em Roraima por estudos sobre a crise climática em comunidades indígenas, a gestora ambiental Sineia Bezerra do Vale agora também é “cientista indígena do Brasil” reconhecida pelo Planetary Guardians, iniciativa que discute a emergência do clima em todo o mundo e tem como foco restaurar a estabilidade da Terra.

Indígena do povo Wapichana, Sineia do Vale recebeu o título no último dia 25 em São Paulo, no mesmo evento em que o cientista brasileiro Carlos Nobre, referência global nos efeitos das mudanças climáticas na Amazônia, foi anunciado como novo membro dos Planetary Guardians – guardiões planetários, em português.

Sineia do Vale tem como foco principal de atuação a crise do clima, que provoca consequências devastadoras em todo o mundo. Foi dela o primeiro estudo ambiental sobre as transformações do clima ao longo dos anos na vida dos povos tradicionais em Roraima.

Ao receber o prêmio de “cientista indígena do Brasil” das mãos de Carlos Nobre, a defensora ambiental destacou que, quando se trata da crise climática, a ciência também precisa levar em conta a experiência que os indígenas vivenciam no dia a dia – discurso que ela sempre defende nos debates sobre o assunto.

“Esse é um momento muito importante para os povos indígenas. Neste momento em que a gente se coloca junto com a ciência que chamamos de ciência universal, a ciência indígena tem uma importância tanto quanto a que os cientistas traduzem para nós, principalmente na questão do clima”, disse Sineia do Vale.

Sineia do Vale (terceira mulher da direita para a esquerda) atua há anos com foco na crise climática e os povos indígenas — Foto: Patricia Zuppi/Rede RCA/Cristiane Júlião/Divulgação

O estudo inédito comandado por Sineia foi o “Amazad Pana’ Adinham: percepção das comunidades indígenas sobre as mudanças climáticas”, relacionado à região da Serra da Lua, em Roraima. A publicação é considerada referência mundial quando se trata da emergência climática e povos tradicionais.

No evento em São Paulo, ela exemplificou como a crise climática é percebida nas comunidades. “Os indígenas já colocaram em seus planos de enfrentamento às mudanças climáticas que as águas já aqueceram, que os peixes já sumiram e que não estamos mais vivendo o período de adaptação, mas o de crise climática.”

“Precisamos de respostas rápidas. Não podemos mais deixar que os países não cumpram seus acordos porque, à medida que o globo terrestre vai aquecendo, os povos indígenas sofrem nas suas terras com grandes catástrofes ambientais”, destacou a gestora.

A indicação partiu da ativista ambiental e geógrafa chadiana Hindou Oumarou, co-presidente do Fórum Internacional de Povos Indígenas sobre Mudanças do Clima e presidente do Fórum Permanente da ONU sobre questões indígenas.

Além da roraimense, também receberam a honraria de “cientista indígena do Brasil” as antropólogas indígenas Braulina Baniwa e Cristiane Julião, do povo Pankararu, cofundadoras da Articulação Nacional das Mulheres Indígenas Guerreiras da Ancestralidade (Anmiga), e o antropólogo e escritor Francisco Apurinã, que pesquisa mudanças ecológicas na perspectiva indígena pela Universidade de Helsinki, na Finlândia.

Mais sobre Sineia do Vale

Sineia do Vale participa desde 2011 da Conferência das Nações Unidas sobre as Mudanças Climáticas (COP, na sigla em inglês) e promove junto às lideranças indígenas a avaliação climática a partir do conhecimento ancestral.

Ela também participa ativamente das discussões internacionais sobre mudanças climáticas há mais de 20 anos, entre elas, a Conferência de Bonn sobre Mudanças Climáticas – chamada de SB60, que ocorre todos os anos em Bonn, na Alemanha. Este ano, a COP29 ocorrerá de 11 a 24 de novembro em Baku, capital do Azerbaijão.

Em 2021, Sineia foi a única brasileira a participar da Cúpula dos Líderes sobre o Clima, evento convocado pelo então presidente estadunidense Joe Biden e que marcou a volta dos EUA às discussões internacionais sobre o clima.

No ano passado, ela recebeu o “Troféu Romy – Mulheres do Ano”, honraria concedida a mulheres que se destacaram em suas áreas de atuação em 2023.

Gestora ambiental de formação, Sineia cursa mestrado em Sustentabilidade junto a Povos e Territórios Tradicionais na Universidade de Brasília (UnB), coordena o Departamento de Gestão Territorial e Ambiental do Conselho Indígena de Roraima (CIR), e integra a Convenção-Quadro das Nações Unidas sobre a Mudança do Clima (UNFCCC), focada na agenda indígena e a implementação de ações em nível local.

Superfreak pivot: When climate engineering came to South Africa (Daily Maverick)

Our Burning Planet

Superfreak pivot: When climate engineering came to South Africa

 Illustrative image. Photo by Andy Hutchinson on Unsplash

By Kevin Bloom

22 Jan 2019 

Cooling the earth by blocking out the sun, although potentially disastrous, is now a real answer to climate change. As a Harvard research paper published late last year proved, solar geo-engineering is both technically feasible and relatively cheap. With governments and international bodies considering the technology, a South African university has just announced a study. But how convenient is this answer for our politicians and heavy emitters?

I. Global Hollywood

In his book The Planet Remade: How Geo-engineering Could Change the World, Oliver Morton laid down a potential scenario from the not-too-distant future. As briefings editor at The Economist and former chief news and features editor at the scientific journal Nature, it was a given that this scenario—a thought experiment on the deployment into the stratosphere of “climate engineering” aerosols—would be based more in science fact than science fiction. Which is exactly what made it, like the best work of Robert Heinlein or Charlie Brooker, truly terrifying.

According to a Harvard study published in November 2018, three years after the release of Morton’s book, it would work in practice like this: a fleet of purpose-built aircraft, with disproportionately large wings relative to their fuselages, so as to allow “level flight at an altitude of 20 kilometres while carrying a 25-ton payload,” would inject 0.2 million tons of sulphur dioxide into the lower stratosphere per year—thereby reflecting enough solar radiation back out into space to cut the rate of global warming progressively in half. Pre-launch costs in 2018 values would come in at $3.5 billion, with yearly operating costs at $2.25 billion. Given that in 2017 around 50 nations had military budgets of $3 billion or more, noted the Harvard scientists, the barriers to entry would be remarkably low.
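Taking the paper's headline figures at face value, the implied unit cost of lofting the aerosol is simple arithmetic. The sketch below just runs those numbers; the 15-year amortisation window for the pre-launch cost is an assumption added here for illustration, not a figure from the study.

```python
# Arithmetic on the deployment figures quoted above from the 2018 Harvard study.
# The 15-year amortisation window is an assumption made here for illustration only.

PRE_LAUNCH_COST = 3.5e9     # USD, one-off
ANNUAL_OPERATING = 2.25e9   # USD per year
PAYLOAD_PER_YEAR = 0.2e6    # tons of sulphur dioxide injected per year
AMORTISATION_YEARS = 15     # assumed spread of the one-off cost

annualised_cost = ANNUAL_OPERATING + PRE_LAUNCH_COST / AMORTISATION_YEARS
print(f"Annualised programme cost: ~${annualised_cost / 1e9:.2f} billion")
print(f"Cost per ton of SO2 delivered to the stratosphere: ~${annualised_cost / PAYLOAD_PER_YEAR:,.0f}")
```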

“It is not a large nation that does it—indeed, it is not a single nation’s action at all,” speculated Morton back in 2015. “Sometime in the 2020s, there is a small group of them, two of which are in a position to host the runways. They call themselves the Concert; once they go public, others call them the Affront. None of them is a rich nation, but nor are they among the least developed. All of them already have low carbon-dioxide emissions, and all of them are on pathways to no emissions at all. In climate terms, they look like the good guys. But their low emissions and the esteem of the environmentally conscious part of the international community are doing nothing to reduce the climate-related risks their citizens face.”

So why “truly terrifying”?

Because, as Morton went on to explain, solar geo-engineering—otherwise known as solar radiation management, or SRM—was not (or at least was no longer) a conceptual absurdity. When he wrote his book, its probability of deployment was already based on two of the most urgent existential questions in the history of humanity: 1) Are the risks of climate change great enough to warrant serious action aimed at mitigating them? 2) Will the world’s largest industrial economies be able to lower their carbon emissions to net zero by the middle of the century?

But terrifying more specifically because, by 2018, the answer to the first question was a scientifically unqualified “yes” and to the second a statistically implausible “no”—and yet the effect of SRM on the biosphere was still unknown. With the results from the Harvard study leading to the scheduling of tests as early as the first half of 2019, the Berlin-based climate science and policy institute Climate Analytics wasted no time in recommending a global ban on the technology.

“Solar radiation management aims at limiting temperature increase by deflecting sunlight, mostly through injection of particles into the atmosphere,” the institute noted. “At best, SRM would mask warming temporarily, but more fundamentally is itself a potentially dangerous interference with the climate system.”

SRM, argued the scientists at Climate Analytics, would “alter the global hydrological cycle as well as fundamentally affect global circulation patterns such as monsoons.” It would not “halt, reverse or address in any other way the profound and dangerous problem of ocean acidification which threatens coral reefs and marine life as it does not reduce CO2 emissions and hence influence atmospheric CO2 concentration.” Also, the scientists pointed out, the approach was “unlikely to attenuate the effects of global warming on global agricultural production” as its “potentially positive effect due to cooling” was projected to be counterbalanced by “negative effects on crop production of reducing solar radiation at the earth’s surface.”

In other words, according to Climate Analytics, while cooler temperatures would be helpful to the world’s farmers, the crops would still need sunlight to grow. And none of the above even counted as the number one reason that the institute was raising the alarm—SRM’s gravest danger, these scientists and policy experts insisted, was that it would divert attention from the core problem, which remained the unprecedented amount of carbon being spewed daily into the atmosphere by the extraction of coal, crude oil and natural gas.

For Morton, this was the predicament known as the “superfreak pivot”—the turning of large masses of humanity from the position that “global warming requires no emissions reduction because it isn’t a real problem” to the position that “the Concert has it all covered”. It was a predicament highlighted too by Harvard scientist David Keith, who told the Guardian in 2017:

“One of the main concerns I and everyone involved in this have is that Trump might tweet ‘geoengineering solves everything—we don’t have to bother about emissions.’ That would break the slow-moving agreement among many environmental groups that sound research in this field makes sense.”

As for South Africa, less than two months after publication of the seminal Harvard paper of late 2018, a press release was issued by the African Climate and Development Initiative of the University of Cape Town.

“UCT researchers to embark on pioneering study on potential impacts of solar geoengineering in southern Africa,” it stated.

II. Local Hollywood

As the recipient of a grant from the international DECIMALS Fund (Developing Country Impacts Modelling Analysis for SRM), the UCT team cited two reasons for going ahead with the study—and both of them had to do with the social and economic havoc that anthropogenic climate change had so far wrought in our corner of the world. First, the 2015/16 summer rainfall failure over southern Africa, which led to 30 million people becoming food insecure in South Africa, Mozambique, Botswana and Zimbabwe. Second, Cape Town almost running out of water in 2018. If SRM could be done in a safe and reliable manner, so the rationale went, it was “the only known way” to quickly offset the temperature increases that were behind the droughts.

“We want to understand the impact of solar radiation management on drought conditions,” Dr Romaric Odoulami, the project’s leader, told Daily Maverick, “that’s our motivation. What will the implications be for regional agriculture? But I want to make one thing clear: SRM has never been implemented in the real world… and we are not going to do it either.”

What the African Climate and Development Initiative was going to do, said Odoulami, was climate modelling. The project, he added, would run for the next “one or two years”—as soon as he got “something interesting,” he promised, he would let Daily Maverick know. For the moment, he wanted to leave us with this:

“Solar radiation management doesn’t stop climate change. It doesn’t stop global emissions of greenhouse gases. The only thing it does is help to reduce the global temperature by reducing the amount of solar radiation reaching the earth’s surface.”
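To make the mechanism Odoulami describes concrete, the usual back-of-the-envelope tool is a zero-dimensional energy-balance model: dim the incoming sunlight slightly and the planet’s equilibrium temperature falls, while the carbon in the atmosphere is left untouched. The sketch below is purely illustrative and is not the UCT team’s model; the solar constant and albedo are standard textbook values, and the 1% and 2% reductions are hypothetical SRM scenarios.

```python
# Minimal zero-dimensional energy-balance sketch (illustrative only, not the
# UCT modelling study). Equilibrium follows S * (1 - albedo) / 4 = sigma * T^4.

SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0              # solar constant, W m^-2 (standard value)
ALBEDO = 0.30            # planetary albedo (textbook figure)

def equilibrium_temp(solar_constant: float, albedo: float = ALBEDO) -> float:
    """Effective equilibrium temperature in kelvin (no greenhouse effect)."""
    absorbed = solar_constant * (1.0 - albedo) / 4.0  # globally averaged absorbed flux
    return (absorbed / SIGMA) ** 0.25

baseline = equilibrium_temp(S0)
for cut in (0.01, 0.02):  # hypothetical 1% and 2% reductions in incoming sunlight
    dimmed = equilibrium_temp(S0 * (1.0 - cut))
    print(f"{cut:.0%} less sunlight -> roughly {baseline - dimmed:.2f} K of cooling")
```

Even in this toy calculation, nothing touches the amount of CO2 in the atmosphere, which is the point both Odoulami and the Climate Analytics scientists keep returning to.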

This caution in the face of the sheer unprecedented scale of the thing was also detectable in the words of Andy Parker, project director of the Solar Radiation Management Governance Initiative, the UK-based organisation—founded in 2010 by, among others, The World Academy of Sciences and the Royal Society—that set up the DECIMALS Fund in 2018. Speaking to Daily Maverick from a conference in Bangladesh, Parker was vague yet morbidly fascinating on the legislative context that could eventually give the green light to SRM.

“That’s really tricky to predict,” he said. “We can imagine various different deployment scenarios. There’s the desperation scenario, where a country or perhaps a coalition of countries that are really suffering from climate change decide that they are going to use solar geo-engineering to stop the temperature from rising. That could be seen as unilateral and illegitimate deployment. At the other end of things, it’s possible that through the United Nations—the UN General Assembly or one of the UN conventions—there’s a much broader coalition that comes together with much more legitimacy to develop a decision-making infrastructure for if we were to ever use this, or indeed, for how we would reject it.

“Really, at this stage, we don’t know what’s going to happen. We don’t know what’s going to happen with the research, we don’t know how governments are going to deal with this, and we don’t know how quickly and how deeply the impacts of climate change are going to bite.”

In South Africa, unfortunately, all indications are that the bite is going to be serious. As Daily Maverick learned from the country’s leading land-based climate scientist in October last year, we are warming at twice the global average. At 3°C of global warming, which is 6°C regionally—and which at current emission rates we are steaming towards, as per the most conservative estimates, before the end of the century—there will be a total collapse of the maize crop and livestock industry. This is something that the Department of Environmental Affairs seems to understand well, as evidenced by their “Third National Communication” under the United Nations Framework Convention on Climate Change, submitted in March 2018.

But the other unknown factor in this general SRM universe of “unknown unknowns” is the person that currently sits atop the DEA. Has Nomvula Mokonyane, who was named at the State Capture inquiry on Monday for allegedly accepting bribes in the form of monthly cash payments, even read the Third National Communication? Does President Cyril Ramaphosa plan on replacing her with someone who will? Aside from Tito Mboweni at treasury, does anyone in the upper echelons of the ANC get the urgency of the situation?

These are the questions that highlight the possibility of South Africa one day performing the superfreak pivot. Because it might not only suit the government to defer to technology when the food and water shortages get real, it might also suit Sasol, the coal mining companies and the country’s heavy emitters at large. DM

Spray and Pray – The risky business of geoengineering Africa’s climate (Daily Maverick)


 Solar Radiation Modification refers to deliberate, large-scale interventions in the global climate system to increase the amount of sunlight reflected away from the planet to reduce global temperatures. Illustrative image: (Generated with Flux AI)

By Ethan van Diemen

07 Aug 2024 

In a webinar on Tuesday, scientists and other experts agreed on the need for solar geoengineering research to enhance the portfolio of climate change responses.

Solar geoengineering, whether through space mirrors or stratospheric particles, is a complex, controversial and contentious field. In a webinar on Tuesday, atmospheric scientists and other experts from across Africa agreed that it is completely rational to explore its role in a portfolio of climate change responses. 

Geoengineering refers to deliberate, large-scale interventions in the Earth’s natural systems with the aim of counteracting climate change. The primary goal of geoengineering is to mitigate the adverse effects of global warming and manage the Earth’s climate system. There are two main categories of geoengineering: Solar Radiation Management (SRM) and Carbon Dioxide Removal (CDR).

The webinar focused on the former, which The Alliance for Just Deliberation on Solar Geoengineering says refers to “deliberate, large-scale interventions in the global climate system to increase the amount of sunlight reflected away from the planet to reduce global temperatures”.

Read more: Superfreak pivot: When climate engineering came to South Africa

The Intergovernmental Panel on Climate Change (IPCC) in its Sixth Assessment Report defines SRM as “a range of radiation modification measures not related to greenhouse gas mitigation that seek to limit global warming. Most methods involve reducing the amount of incoming solar radiation reaching the surface, but others also act on the longwave radiation budget by reducing optical thickness and cloud lifetime.”

(Source: The Alliance for Just Deliberation on Solar Geoengineering)

Hassaan Sipra, director of global engagement at The Alliance and a climate researcher, explained that SRM – in line with conclusions by the IPCC – is not meant to stop climate change but only to buy time for the deep reductions in greenhouse gas emissions needed to limit global warming. He also set out the context wherein SRM was an increasingly attractive area of research. 

At the UN climate conference in Paris, the world agreed to hold the increase in global average temperature to well below 2°C above pre-industrial levels, while pursuing efforts to limit it to 1.5°C. At present, we are on a trajectory to exceed even 2°C. This is important because every fraction of a degree drastically increases the risks associated with anthropogenic climate change.

“And typically, now, within this context,” said Sipra, “what is being talked about is the use of carbon dioxide removal technologies. So we know that we’re not going to get to net zero emissions until about 2100 if we’re looking for 1.5°C. If it’s 2°C, we’re not going to get there until after 2100. So in the meantime, we also need to start scaling up our carbon dioxide removal technologies so that whatever carbon is in the atmosphere, we are immediately able to capture it and bring that back.”

Put differently, carbon removal will still be necessary in the future because even with significant reductions in greenhouse gas emissions, existing atmospheric carbon levels must be reduced to meet net zero targets and stabilise global temperatures, as outlined in the Paris Agreement. This ensures long-term climate goals are achievable by offsetting any remaining emissions.

Sipra explained that carbon dioxide removal faces the interrelated problems of cost and scale.

It’s “an expensive technology or a set of technologies that would take a long time to scale up and would require a tremendous amount of resources, and at present, those resources are not yet scalable… they’re not yet available, the technologies are not yet fully tested, and so we need a lot of time before we’re going to get to carbon dioxide removal technologies.

“We need time to cut emissions and we need time to get to carbon dioxide removal technologies. Yet climate impacts are continuing to rise in the meantime. And this is the point where for scientists, policymakers, civil society, the deliberation has begun as to what might be the possibility of buying some additional time; putting in a potential stopgap measure.”

Napkin diagram roughly showing SRM’s role in managing climate risks. (This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.)

SRM is a “stopgap measure”, Sipra explained, in contrast to emissions reductions or carbon dioxide removal because “it does not actually offer a solution to our climate problems, it merely masks it. And so, without addressing the root cause of climate change, you are kind of just giving yourself this, in essence, a drug which may delay – potentially – some of the impacts of climate change”.

But just how is SRM meant to achieve this? 

Prof Babatunde Abiodun, an expert in climate model development and applications, shared some details on the state of SRM research, the approaches being explored and the experiments under way. Three of the projects he noted are highlighted here:

  • Stratospheric Aerosol Processes, Budget and Radiative Effects (SABRE): SABRE investigates how tiny particles in the stratosphere may reflect sunlight to cool the Earth. The project is “an extended airborne science measurement programme” and aims to understand the effectiveness and potential impacts of these aerosols so as to strengthen the “scientific foundation to inform policy decisions related to regulating global emissions that impact the stratosphere (eg ozone depleting substances, rocket exhaust) and the potential injection of material into the stratosphere to combat global warming (climate intervention)”.
  • Stratospheric Controlled Perturbation Experiment (SCoPEx): SCoPEx, a Harvard University-led project, explores the feasibility of dispersing reflective particles in the stratosphere to mimic volcanic cooling effects, using a high-altitude balloon to release small amounts of aerosols over a small area. However, the project has recently moved away from its focus on science related to solar geoengineering.
  • Geoengineering Assessment Across Uncertainties, Scenarios and Strategies (GAUSS): GAUSS evaluates the potential risks and benefits of various geoengineering methods by using complex computer simulations. Early findings suggest that while geoengineering can reduce global temperatures, it may also lead to regional climate changes, emphasising the need for careful, scenario-based planning. They explain that “one challenge today is a degree of arbitrariness in the scenarios used in current SRM simulations”.

SRM and other geoengineering approaches, however, are not without controversy. The main concerns are the potential for unintended environmental side effects, ethical issues regarding the manipulation of natural systems and the risk of unequal impacts on different regions potentially exacerbating global inequalities.

The IPCC says in the Summary for Policymakers of its Sixth Assessment Report that, with high confidence, “solar radiation modification approaches, if they were to be implemented, introduce a widespread range of new risks to people and ecosystems, which are not well understood. Solar radiation modification approaches have the potential to offset warming and ameliorate some climate hazards, but substantial residual climate change or overcompensating change would occur at regional scales and seasonal timescales.

“Large uncertainties and knowledge gaps are associated with the potential of solar radiation modification approaches to reduce climate change risks. Solar radiation modification would not stop atmospheric CO₂ concentrations from increasing or reduce resulting ocean acidification under continued anthropogenic emissions.”

To this, the gathered scientists and experts said that while they recognised the potential risks, these should be weighed against the risk of the status quo or inaction.

“It makes sense to think about SRM as a very risky proposition, but it’s a risky proposition that has to be compared to an alternative risky proposition, which is worsening climate change. So, climate change increases risks to peoples and ecosystems. With each ton of carbon dioxide we’re adding into the atmosphere, with each incremental bit of warming, those risks rise exponentially.

“So, just like climate change has its risks, SRM has risks. It also has potential benefits, and it has a large amount of uncertainties, and none of them are well understood. So, in order to make a comparison against climate change with SRM, we need to really have an informed decision-making process around SRM so that we can have a better sense of its benefits and its drawbacks,” said Sipra.

“We need to explore SRM in the context of worsening climate change,” he said, adding that geoengineering would “not be a discussion if the climate situation had been resolved after the Rio summit when they formulated the UN Framework Convention on Climate Change.

“The fact that the climate is getting worse; the fact that we are not mitigating, is the reason people are beginning to have a conversation about SRM. So, it can only ever be contextualised in comparison to climate change.” DM

One-quarter of unresponsive people with brain injuries are conscious (Nature)

More people than we thought who are in comas or similar states can hear what is happening around them, a study shows.

14 August 2024

Julian Nowogrodzki

    A coloured CT scan showing a bleed on a patient's brain.
    Blood (red; artificially coloured) pools in the brain of a person who has had a stroke, a common cause of coma. Credit: Zephyr/Science Photo Library

    At least one-quarter of people who have severe brain injuries and cannot respond physically to commands are actually conscious, according to the first international study of its kind [1].

    Although these people could not, say, give a thumbs-up when prompted, they nevertheless repeatedly showed brain activity when asked to imagine themselves moving or exercising.

    “This is one of the very big landmark studies” in the field of coma and other consciousness disorders, says Daniel Kondziella, a neurologist at Rigshospitalet, the teaching hospital for Copenhagen University.

    The results mean that a substantial number of people with brain injuries who seem unresponsive can hear things going on around them and might even be able to use brain–computer interfaces (BCIs) to communicate, says study leader Nicholas Schiff, a neurologist at Weill Cornell Medicine in New York City. BCIs are devices implanted into a person’s head that capture brain activity, decode it and translate it into commands that can, for instance, move a computer cursor. “We should be allocating resources to go out and find these people and help them,” Schiff says. The work was published today in The New England Journal of Medicine [1].

    Scanning the brain

    The study included 353 people with brain injuries caused by events such as physical trauma, heart attacks or strokes. Of these, 241 could not react to any of a battery of standard bedside tests for responsiveness, including one that asks for a thumbs-up; the other 112 could.

    Everyone enrolled in the study underwent one or both of two types of brain scan. The first was functional magnetic resonance imaging (fMRI), which measures mental activity indirectly by detecting the oxygenation of blood in the brain. The second was electroencephalography (EEG), which uses an electrode-covered cap on a person’s scalp to measure brain-wave activity directly. During each scan, people were told to imagine themselves playing tennis or opening and closing their hand. The commands were repeated continuously for 15–30 seconds, then there was a pause; the exercise was then repeated for six to eight command sessions.

    Of the physically unresponsive people, about 25% showed brain activity across the entire exam for either EEG or fMRI. The medical name for being able to respond mentally but not physically is cognitive motor dissociation. The 112 people in the study who were classified as responsive did a bit better on the brain-activity tests, but not much: only about 38% showed consistent activity. This is probably because the tests set a high bar, Schiff says. “I’ve been in the MRI, and I’ve done this experiment, and it’s hard,” he adds.
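The study’s bar for “consistent activity” is statistical rather than impressionistic: a participant only counts as showing cognitive motor dissociation if the difference between the “imagine” and “rest” periods holds up across the repeated command sessions. The sketch below is a toy illustration of that kind of criterion, using a simple permutation test on hypothetical per-session feature values; it is not the authors’ actual EEG or fMRI analysis pipeline.

```python
# Toy illustration of a "consistent command-following" test (not the study's
# actual pipeline). Each session yields one summary feature for the "imagine"
# block and one for the "rest" block; the values below are hypothetical.
import random
import statistics

random.seed(0)

imagine = [0.82, 0.77, 0.91, 0.68, 0.74, 0.88]  # hypothetical session features
rest = [0.55, 0.61, 0.49, 0.66, 0.58, 0.52]

observed = statistics.mean(imagine) - statistics.mean(rest)

# Permutation test: shuffle the condition labels and count how often a mean
# difference at least this large arises by chance.
pooled = imagine + rest
n_perm, extreme = 10_000, 0
for _ in range(n_perm):
    random.shuffle(pooled)
    if statistics.mean(pooled[:6]) - statistics.mean(pooled[6:]) >= observed:
        extreme += 1

p_value = (extreme + 1) / (n_perm + 1)
print(f"mean difference = {observed:.3f}, permutation p = {p_value:.4f}")
# Only an effect that survives this kind of test across the whole exam would
# count as command-following, which is why the bar is described as high.
```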

    This isn’t the first time a study has found cognitive motor dissociation in people with brain injuries who were physically unresponsive. For instance, in a 2019 paper, 15% of the 104 people undergoing testing displayed this behaviour [2]. The latest study, however, is larger and is the first multicentre investigation of its kind. Tests were run at six medical facilities in four countries: Belgium, France, the United Kingdom and the United States.

    The 25% of unresponsive people who showed brain activity tended to be younger than those who did not, to have injuries that were from physical trauma and to have been living with their injuries for longer than the others. Kondziella cautions that further investigating these links would require repeat assessments of individuals over weeks or months. “We know very little about consciousness-recovery trajectories over time and across different brain injuries,” he says.

    Room for improvement

    But the study has some limitations. For example, the medical centres did not all use the same number or set of tasks during the EEG or fMRI scans, or the same number of electrodes during EEG sessions, which could skew results.

    In the end, however, with such a high bar set for registering brain activity, the study probably underestimates the proportion of physically unresponsive people who are conscious, Schiff says. Kondziella agrees. Rates of cognitive motor dissociation were highest for people tested with both EEG and fMRI, he points out, so if both methods were used with every person in the study, the overall rates might have been even higher.

    However, the kinds of test used are logistically and computationally challenging, “which is why really only a handful or so of centres worldwide are able to adopt these techniques”, Kondziella says.

    Schiff stresses that it’s important to be able to identify people with brain injuries who seem unresponsive but are conscious. “There are going to be people we can help get out of this condition,” he says, perhaps by using BCIs or other treatments, or simply continuing to provide medical care. Knowing that someone is conscious can change how families and medical teams make decisions about life support and treatment. “It makes a difference every time you find out that somebody is responsive,” he says.

    doi: https://doi.org/10.1038/d41586-024-02614-z

    References

    1. Bodien, Y. G. et al. N. Engl. J. Med. 391, 598–608 (2024).
    2. Claassen, J. et al. N. Engl. J. Med. 380, 2497–2505 (2019).


    Science alone will not solve the climate crisis, says American philosopher (Folha de S.Paulo)

    www1.folha.uol.com.br

    Maurício Meireles

    26 June 2024


    Science alone, without politics, will not be able to produce the solutions the world needs to fight the climate crisis.

    So says Michael Sandel, professor of political philosophy at Harvard University and author of several books; the most recent to be published in Brazil is “Democracy’s Discontent” (“O Descontentamento da Democracia”, ed. Civilização Brasileira).

    The statement may sound like a truism, but it sums up the core of Sandel’s critique of contemporary democracies: when politics is dominated by technocratic discourse, citizens are left without a voice and without the means to take part in the decisions that affect their lives.

    In “Democracy’s Discontent”, he argues that the blame for the rise of authoritarianism around the world lies with so-called neoliberalism, which promoted the strengthening of the financial sector and the deregulation of markets.

    For Sandel, these policies were carried forward by technocrats on the right and the left who alienated citizens from economic decisions, backed by a discourse of meritocracy, generating a backlash of resentment against elites and scepticism about democracy.

    That is why, he says, it is important not to repeat the same mistake with climate policy. Science is indeed crucial for grounding decisions, but democratic participation is needed, and leaders cannot use scientific discourse as a way of escaping their responsibilities.

    Sandel recently took part in the online lecture series Clima e Sociedade, organised by UFSM (Universidade Federal de Santa Maria), in which he spoke with community leaders on the front line of the fight against the effects of the floods that hit Rio Grande do Sul.

    In an interview with Folha, he explains how the points raised in his book apply to the climate debate, assesses the role of multilateral institutions in this field and argues for avoiding apocalyptic rhetoric.

    Rather than reinforcing a sense of solidarity, crises such as the pandemic seem to have laid bare divisions, and democracies like the United States remain as polarised as before. Does the world overestimate the capacity of these crises to generate mutual solidarity?

    The pandemic was a great test. At the start of the health crisis, we often heard that the emergency would bring us together by showing that we are all equally vulnerable, despite economic inequalities. Those who could work from home quickly realised how much we depend on the work of people who are ignored.

    It could have been the moment to debate better pay and recognition for essential workers. But that did not happen. The pandemic receded, and so did the gratitude towards them. It was a missed opportunity; the pandemic did not lead us to a social or spiritual transformation.

    To sustain the sense of community that crises such as the floods in the South awaken, it is necessary to create solid institutions and public spaces, as well as forms of organisation. Pressing governments for resources is important, but we must also build institutions in civil society and establish an ongoing dialogue so that everyone understands that we share a common life.

    You place the economy at the centre of your analysis of the crisis of liberal democracies, rather than a more cultural analysis, which today seems more popular in public debate. Why is the economic approach more appropriate?

    In “Democracy’s Discontent”, I write about what I call the political economy of citizenship. In recent decades, we went wrong in assuming that the only purpose of the economy is to promote consumption, and that our main focus should therefore be GDP growth and the distribution of wealth.

    All of that matters, of course, but these are not the only questions we should take into account. We need to ask which economic arrangements are favourable to democratic participation.

    Democracy is not just about going to vote. Democracy means creating the economic and social conditions that allow people to deliberate as equals [about their destiny], shaping the forces that govern them.

    When people have no voice, they feel excluded, angry and resentful, and they will connect with demagogic politicians who channel that alienation. That is what begins to explain what has been happening in recent years, both in the US and in Brazil.

    I am against such a rigid separation between an economic analysis and a cultural one of why citizens around the world have been supporting authoritarian leaders.

    In recent years, the divide between winners and losers has deepened, poisoning politics. This has partly to do with inequality, but there has also been a shift in attitudes towards individual success.

    Those at the top believe that the rewards they have received from the market are the measure of their merit, and, by extension, that those left behind deserve their fate as well.

    This helps to explain the politics of anger and resentment, and also the growth of authoritarian right-wing populism. Many in the working class feel that elites look down on them, with particular contempt for those who did not go to university.

    Now, how much of this analysis is economic and how much is cultural? It is both. It is economic because it shows how the economy can erode participatory democracy, and because it focuses on the role of growing inequality. But it is also cultural, because I identify those elements of resentment.

    We need to tie the two ends together: tackle inequality and deal with the feeling of having no voice, the anger of so many in the working class.

    Your analysis centres on globalisation and so-called neoliberal policies. What is the impact of this on the political debate about climate change?

    Climate change politics is an example of how something focused on elites quickly turns into a technocratic discourse about the environment, which can reinforce polarisation.

    There is a way of organising the economy and conducting public policy that leaves citizens feeling voiceless. It is a politics that treats economic and environmental questions as technical matters that concern only the experts.

    We saw that technocratic discourse during the pandemic, when representatives of the elites said they were just “following the science”, which is a way of escaping responsibility.

    Of course it is important to follow the science in the middle of a pandemic or of climate change, but it is a mistake to assume that science alone can make the necessary political judgements.

    During the pandemic, for example, science could not settle whether we should close schools and for how long. That was a political judgement, which had to be debated by citizens within the democratic process, and whoever took the decisions had to take responsibility for them.

    It is not enough simply to say you are following the science. In the case of climate policy, science needs to inform the decisions we are going to take, but those measures need to be debated among the people affected by them. Because this involves negotiations, distributive questions, a debate about who will pay the price of the transition to the green economy… and so on.

    What has brought us to this highly polarised moment is the insistence of political elites that they are experts or are backed by experts. The implication is that anyone who disagrees is either misinformed or ignorant, and therefore not qualified to have a voice.

    It is the same thing that happened to the dignity of the working class, eroded by the financialisation of the economy and the outsourcing of labour, all in the name of experts who said it would be good for everyone.

    If we repeat this technocratic posture in the fight against climate change, we will once again have elites looking down on citizens, saying they are just following the science.

    Climate change is a crisis that requires global action. But in your book you are sceptical about multilateral organisations. How should they work?

    Climate change undoubtedly requires global cooperation. And that means we need multilateral institutions to design and implement policies in order to protect the planet and promote the transition to a green economy.

    Global institutions today operate as technocratic institutions, without democratic legitimacy or participation. And that is a problem we will have to deal with.

    Europe was already grappling with this before the climate crisis became so central. Even that bloc, where most countries have a democratic tradition, has come to be associated with the bureaucrats of Brussels. National citizens resent rulings handed down by those bureaucrats and feel they have no voice.

    The same goes for the institutions that will be needed to deal with the climate crisis. We need platforms for a global public discourse, one that can involve ordinary citizens in formulating the policies that will take us to a green economy.

    If it is something purely technocratic, even the best-run group of experts will not be able to win democratic legitimacy and implement any policy. In short, I am not sceptical about multilateral institutions; I simply want to emphasise the need for democratic participation.

    Your main point is that neoliberal policies drove voters to the far right. You note that President Joe Biden was the first to break with those economic guidelines. Yet Donald Trump has a strong chance of being elected. What happened?

    A new Trump term would increase the risks facing American democracy. Yes, Joe Biden broke with the neoliberal market policies that produced political polarisation.

    He spoke about trying to restore the dignity of work, including for people without a university education.

    Biden revitalised antitrust policy, not only to lower consumer prices but also to hold economic power accountable, especially in the case of big tech. He also got Congress to approve public investment in infrastructure on a scale not seen in decades.

    Why is he not reaping political dividends from this? He has not been able to articulate these measures into a new vision of government, nor to explain how all these policies, taken together, can renew democratic citizenship, and how, by restoring the dignity of work, people can have a voice in politics.

    Leadership is not just a matter of implementing good policies; it is about offering a vision that is at once economic, political and moral.


    PROFILE

    Michael J. Sandel, 71

    He is a professor of political philosophy at Harvard University, in the US, whose works have been translated into more than 30 languages. In books such as “Democracy’s Discontent” and “The Tyranny of Merit”, he writes about ethics, economics and democracy, among other topics. His course Justice was the first at Harvard to be made freely available online and has been watched by tens of millions of people.

    ‘Everybody has a breaking point’: how the climate crisis affects our brains (Guardian)

    Researchers measuring the effect of Hurricane Sandy on children in utero at the time reported: ‘Our findings are extremely alarming.’ Illustration: Ngadi Smart/The Guardian

    Are growing rates of anxiety, depression, ADHD, PTSD, Alzheimer’s and motor neurone disease related to rising temperatures and other extreme environmental changes?

    Original article

    Clayton Page Aldern

    Wed 27 Mar 2024 05.00 GMT

    In late October 2012, a category 3 hurricane howled into New York City with a force that would etch its name into the annals of history. Superstorm Sandy transformed the city, inflicting more than $60bn in damage, killing dozens, and forcing 6,500 patients to be evacuated from hospitals and nursing homes. Yet in the case of one cognitive neuroscientist, the storm presented, darkly, an opportunity.

    Yoko Nomura had found herself at the centre of a natural experiment. Prior to the hurricane’s unexpected visit, Nomura – who teaches in the psychology department at Queens College, CUNY, as well as in the psychiatry department of the Icahn School of Medicine at Mount Sinai – had meticulously assembled a research cohort of hundreds of expectant New York mothers. Her investigation, the Stress in Pregnancy study, had aimed since 2009 to explore the potential imprint of prenatal stress on the unborn. Drawing on the evolving field of epigenetics, Nomura had sought to understand the ways in which environmental stressors could spur changes in gene expression, the likes of which were already known to influence the risk of specific childhood neurobehavioural outcomes such as autism, schizophrenia and attention deficit hyperactivity disorder (ADHD).

    The storm, however, lent her research a new, urgent question. A subset of Nomura’s cohort of expectant women had been pregnant during Sandy. She wanted to know if the prenatal stress of living through a hurricane – of experiencing something so uniquely catastrophic – acted differentially on the children these mothers were carrying, relative to those children who were born before or conceived after the storm.

    More than a decade later, she has her answer. The conclusions reveal a startling disparity: children who were in utero during Sandy bear an inordinately high risk of psychiatric conditions today. For example, girls who were exposed to Sandy prenatally experienced a 20-fold increase in anxiety and a 30-fold increase in depression later in life compared with girls who were not exposed. Boys had 60-fold and 20-fold increased risks of ADHD and conduct disorder, respectively. Children expressed symptoms of the conditions as early as preschool.
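A “20-fold increase” of this kind is a relative risk: the proportion of exposed children who go on to develop a condition divided by the proportion of unexposed children who do. The counts below are hypothetical, chosen only to show how such a ratio is computed; they are not figures from Nomura’s cohort.

```python
# Hypothetical 2x2 table (not data from the Stress in Pregnancy study):
# rows = exposed in utero to Sandy vs not exposed; columns = diagnosis yes/no.
exposed_cases, exposed_total = 20, 100       # 20% of exposed children diagnosed
unexposed_cases, unexposed_total = 2, 200    # 1% of unexposed children diagnosed

risk_exposed = exposed_cases / exposed_total
risk_unexposed = unexposed_cases / unexposed_total
relative_risk = risk_exposed / risk_unexposed

print(f"risk if exposed = {risk_exposed:.1%}, risk if unexposed = {risk_unexposed:.1%}")
print(f"relative risk = {relative_risk:.0f}x")  # prints 20x, i.e. a '20-fold increase'
```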

    A resident pulls a woman in a canoe down 6th Street as high tide, rain and winds flood local streets on October 29, 2012 in Lindenhurst, New York.
    Flooding in Lindenhurst, New York, in October 2012, after Hurricane Sandy struck. Photograph: Bruce Bennett/Getty Images

    “Our findings are extremely alarming,” the researchers wrote in a 2022 study summarising their initial results. It is not the type of sentence one usually finds in the otherwise measured discussion sections of academic papers.

    Yet Nomura and her colleagues’ research also offers a representative page in a new story of the climate crisis: a story that says a changing climate doesn’t just shape the environment in which we live. Rather, the climate crisis spurs visceral and tangible transformations in our very brains. As the world undergoes dramatic environmental shifts, so too does our neurological landscape. Fossil-fuel-induced changes – from rising temperatures to extreme weather to heightened levels of atmospheric carbon dioxide – are altering our brain health, influencing everything from memory and executive function to language, the formation of identity, and even the structure of the brain. The weight of nature is heavy, and it presses inward.

    Evidence comes from a variety of fields. Psychologists and behavioural economists have illustrated the ways in which temperature spikes drive surges in everything from domestic violence to online hate speech. Cognitive neuroscientists have charted the routes by which extreme heat and surging CO2 levels impair decision-making, diminish problem-solving abilities, and short-circuit our capacity to learn. Vectors of brain disease, such as ticks and mosquitoes, are seeing their habitable ranges expand as the world warms. And as researchers like Nomura have shown, you don’t need to go to war to suffer from post-traumatic stress disorder: the violence of a hurricane or wildfire is enough. It appears that, due to epigenetic inheritance, you don’t even need to have been born yet.

    When it comes to the health effects of the climate crisis, says Burcin Ikiz, a neuroscientist at the mental-health philanthropy organisation the Baszucki Group, “we know what happens in the cardiovascular system; we know what happens in the respiratory system; we know what happens in the immune system. But there’s almost nothing on neurology and brain health.” Ikiz, like Nomura, is one of a growing cadre of neuroscientists seeking to connect the dots between environmental and neurological wellness.

    As a cohesive effort, the field – which we might call climatological neuroepidemiology – is in its infancy. But many of the effects catalogued by such researchers feel intuitive.

    Two people trudge along a beach, with the sea behind them, and three folded beach umbrellas standing on the beach. The sky is a dark orange colour and everything in the picture is strongly tinted orange.
    Residents evacuate Evia, Greece, in 2021, after wildfires hit the island. Photograph: Bloomberg/Getty Images

    Perhaps you’ve noticed that when the weather gets a bit muggier, your thinking does the same. That’s no coincidence; it’s a nearly universal phenomenon. During a summer 2016 heatwave in Boston, Harvard epidemiologists showed that college students living in dorms without air conditioning performed standard cognitive tests more slowly than those living with it. In January of this year, Chinese economists noted that students who took mathematics tests on days above 32C looked as if they had lost the equivalent of a quarter of a year of education, relative to test days in the range 22–24C. Researchers estimate that the disparate effects of hot school days – disproportionately felt in poorer school districts without access to air conditioning and home to higher concentrations of non-white students – account for something on the order of 5% of the racial achievement gap in the US.

    Cognitive performance is the tip of the melting iceberg. You may have also noticed, for example, your own feelings of aggression on hotter days. You and everyone else – and animals, too. Black widow spiders tend more quickly toward sibling cannibalism in the heat. Rhesus monkeys start more fights with one another. Baseball pitchers are more likely to intentionally hit batters with their pitches as temperatures rise. US Postal Service workers experience roughly 5% more incidents of harassment and discrimination on days above 32C, relative to temperate days.

    Neuroscientists point to a variety of routes through which extreme heat can act on behaviour. In 2015, for example, Korean researchers found that heat stress triggers inflammation in the hippocampus of mice, a brain region essential for memory storage. Extreme heat also diminishes neuronal communication in zebrafish, a model organism regularly studied by scientists interested in brain function. In human beings, functional connections between brain areas appear more randomised at higher temperatures. In other words, heat limits the degree to which brain activity appears coordinated. On the aggression front, Finnish researchers noted in 2017 that high temperatures appear to suppress serotonin function, more so among people who had committed violent crimes. For these people, blood levels of a serotonin transporter protein, highly correlated with outside temperatures, could account for nearly 40% of the fluctuations in the country’s rate of violent crime.
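“Accounting for nearly 40% of the fluctuations” is a claim about explained variance: the square of the correlation coefficient between the transporter measure and the violent-crime rate. The series below are invented purely to show the calculation (and assume Python 3.10+ for statistics.correlation); they are not the Finnish data.

```python
# Illustrative only: invented yearly series, not the Finnish study's data.
import statistics

transporter = [1.0, 1.4, 0.9, 1.6, 1.2, 1.8, 1.1, 1.5]  # hypothetical blood levels
crime_rate = [4.0, 4.3, 4.5, 5.0, 4.2, 5.1, 4.8, 4.3]   # hypothetical crime rates

r = statistics.correlation(transporter, crime_rate)  # Pearson r, Python 3.10+
print(f"r = {r:.2f}; variance explained = r^2 = {r * r:.0%}")
# Whatever r^2 prints for these made-up numbers, the published claim corresponds
# to a real-world r^2 of roughly 0.4, i.e. about 40% of year-to-year variation.
```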

    Illustration of a person sweating in an extreme heat scenario
    Prolonged exposure to heat can activate a multitude of biochemical pathways associated with Alzheimer’s and Parkinson’s. Illustration: Ngadi Smart/The Guardian

    “We’re not thinking about any of this,” says Ikiz. “We’re not getting our healthcare systems ready. We’re not doing anything in terms of prevention or protections.”

    Ikiz is particularly concerned with the neurodegenerative effects of the climate crisis. In part, that’s because prolonged exposure to heat in its own right – including an increase of a single degree centigrade – can activate a multitude of biochemical pathways associated with neurodegenerative diseases such as Alzheimer’s and Parkinson’s. Air pollution does the same thing. (In rats, such effects are seen after exposure to extreme heat for a mere 15 minutes a day for one week.) Thus, with continued burning of fossil fuels, whether through direct or indirect effects, comes more dementia. Researchers have already illustrated the manners in which dementia-related hospitalisations rise with temperature. Warmer weather worsens the symptoms of neurodegeneration as well.

    Prior to her move to philanthropy, Ikiz’s neuroscience research largely focused on the mechanisms underlying the neurodegenerative disease amyotrophic lateral sclerosis (ALS, also known as Lou Gehrig’s disease or motor neurone disease). Today, she points to research suggesting that blue-green algae, blooming with ever-increasing frequency under a changing global climate, releases a potent neurotoxin that offers one of the most compelling causal explanations for the incidence of non-genetic ALS. Epidemiologists have, for example, identified clusters of ALS cases downwind of freshwater lakes prone to blue-green algae blooms.

    A woman pushing a shopping trolley grabs the last water bottles from a long empty shelf in a supermarket.
    A supermarket in Long Beach is stripped of water bottles in preparation for Hurricane Sandy. Photograph: Mike Stobe/Getty Images

    It’s this flavour of research that worries her the most. Children constitute one of the populations most vulnerable to these risk factors, since such exposures appear to compound cumulatively over one’s life, and neurodegenerative diseases tend to manifest in the later years. “It doesn’t happen acutely,” says Ikiz. “Years pass, and then people get these diseases. That’s actually what really scares me about this whole thing. We are seeing air pollution exposure from wildfires. We’re seeing extreme heat. We’re seeing neurotoxin exposure. We’re in an experiment ourselves, with the brain chronically exposed to multiple toxins.”

    Other scientists who have taken note of these chronic exposures resort to similarly dramatic language as that of Nomura and Ikiz. “Hallmarks of Alzheimer disease are evolving relentlessly in metropolitan Mexico City infants, children and young adults,” is part of the title of a recent paper spearheaded by Dr Lilian Calderón-Garcidueñas, a toxicologist who directs the University of Montana’s environmental neuroprevention laboratory. The researchers investigated the contributions of urban air pollution and ozone to biomarkers of neurodegeneration and found physical hallmarks of Alzheimer’s in 202 of the 203 brains they examined, from residents aged 11 months to 40 years old. “Alzheimer’s disease starting in the brainstem of young children and affecting 99.5% of young urbanites is a serious health crisis,” Calderón-Garcidueñas and her colleagues wrote. Indeed.

    A flooded Scottish street, with cars standing in water, their wheels just breaking the surface. A row of houses in the background with one shop called The Pet Shop.
    Flooding in Stonehaven, Aberdeenshire, in 2020. Photograph: Martin Anderson/PA

    Such neurodevelopmental challenges – the effects of environmental degradation on the developing and infant brain – are particularly large, given the climate prognosis. Rat pups exposed in utero to 40C heat miss brain developmental milestones. Heat exposure during neurodevelopment in zebrafish magnifies the toxic effects of lead exposure. In people, early pregnancy exposure to extreme heat is associated with a higher risk of children developing neuropsychiatric conditions such as schizophrenia and anorexia. It is also probable that the ALS-causing neurotoxin can travel in the air.

    Of course, these exposures only matter if you make it to an age in which neural rot has a chance to manifest. Neurodegenerative disease mostly makes itself known in middle-aged and elderly people. But, on the other hand, the brain-eating amoeba likely to spread as a result of the climate crisis – which is 97% fatal and will kill someone in a week – mostly infects children who swim in lakes. As children do.

    A coordinated effort to fully understand and appreciate the neurological costs of the climate crisis does not yet exist. Ikiz is seeking to rectify this. In spring 2024, she will convene the first meeting of a team of neurologists, neuroscientists and planetary scientists, under the banner of the International Neuro Climate Working Group.

    Mexico City landscape engulfed in smog.
    Smog hits Mexico City. Photograph: E_Rojas/Getty Images/iStockphoto

    The goal of the working group (which, full disclosure, I have been invited to join) is to wrap a collective head around the problem and seek to recommend treatment practices and policy recommendations accordingly, before society finds itself in the midst of overlapping epidemics. The number of people living with Alzheimer’s is expected to triple by 2050, says Ikiz – and that’s without taking the climate crisis into account. “That scares me,” she says. “Because in 2050, we’ll be like: ‘Ah, this is awful. Let’s try to do something.’ But it will be too late for a lot of people.

    “I think that’s why it’s really important right now, as evidence is building, as we’re understanding more, to be speaking and raising awareness on these issues,” she says. “Because we don’t want to come to that point of irreversible damage.”

    For neuroscientists considering the climate problem, avoiding that point of no return implies investing in resilience research today. But this is not a story of climate anxiety and mental fortitude. “I’m not talking about psychological resilience,” says Nomura. “I’m talking about biological resilience.”

    A research agenda for climatological neuroepidemiology would probably bridge multiple fields and scales of analysis. It would merge insights from neurology, neurochemistry, environmental science, cognitive neuroscience and behavioural economics – from molecular dynamics to the individual brain to whole ecosystems. Nomura, for example, wants to understand how external environmental pressures influence brain health and cognitive development; who is most vulnerable to these pressures and when; and which preventive strategies might bolster neurological resilience against climate-induced stressors. Others want to price these stressors, so policymakers can readily integrate them into climate-action cost-benefit analyses.

    Wrecked houses along a beach.
    Storm devastation in Seaside Heights, New Jersey. Photograph: Mike Groll/AP

    For Nomura, it all comes back to stress. Under the right conditions, prenatal exposure to stress can be protective, she says. “It’s like an inoculation, right? You’re artificially exposed to something in utero and you become better at handling it – as long as it is not overwhelmingly toxic.” Stress in pregnancy, in moderation, can perhaps help immunise the foetus against the most deleterious effects of stress later in life. “But everybody has a breaking point,” she says.

    Identifying these breaking points is a core challenge of Nomura’s work. And it’s a particularly thorny challenge, in that as a matter of both research ethics and atmospheric physics, she and her colleagues can’t just gin up a hurricane and selectively expose expecting mothers to it. “Human research in this field is limited in a way. We cannot run the gold standard of randomised clinical trials,” she says. “We cannot do it. So we have to take advantage of this horrible natural disaster.”

    Recently, Nomura and her colleagues have begun to turn their attention to the developmental effects of heat. They will apply similar methods to those they applied to understanding the effects of Hurricane Sandy – establishing natural cohorts and charting the developmental trajectories in which they’re interested.

    The work necessarily proceeds slowly, in part because human research is further complicated by the fact that it takes people longer than animals to develop. Rats zoom through infancy and are sexually mature by about six weeks, whereas for humans it takes more than a decade. “That’s a reason this longitudinal study is really important – and a reason why we cannot just get started on the question right now,” says Nomura. “You cannot buy 10 years’ time. You cannot buy 12 years’ time.” You must wait. And so she waits, and she measures, as the waves continue to crash.

    Clayton Page Aldern’s book The Weight of Nature, on the effects of climate change on brain health, is published by Allen Lane on 4 April.

    Ditching ‘Anthropocene’: why ecologists say the term still matters (Nature)

    An aerial view of a section of the Niger River in Bamako clogged with plastic waste and other polluting materials.
    Plastic waste is clogging the Niger River in Bamako, Mali. After it sediments, plastic will become part of the geological record of human impacts on the planet. Credit: Michele Cattani/AFP via Getty

    Original article

    Beyond stratigraphic definitions, the name has broader significance for understanding humans’ place on Earth.

    David Adam

    14 March 2024

    After 15 years of discussion, geologists last week decided that the Anthropocene — generally understood to be the age of irreversible human impacts on the planet — will not become an official epoch in Earth’s geological timeline.

    The rejected proposal would have codified the end of the current Holocene epoch, which has been in place since the end of the last ice age 11,700 years ago. It suggested that the Anthropocene started in 1952, when plutonium from hydrogen-bomb tests showed up in the sediment of Crawford Lake near Toronto, Canada.

    The vote has drawn controversy over procedural details, and debate about its legitimacy continues. But whether or not it’s formally approved as a stratigraphic term, the idea of the Anthropocene is now firmly rooted in research. So, how are scientists using the term, and what does it mean to them and their fields?

    ‘It’s a term that belongs to everyone’

    As head of the Leverhulme Centre for Anthropocene Biodiversity at the University of York, UK, Chris Thomas has perhaps more riding on the term than most. “When the news of this — what sounds like a slightly dodgy vote — happened, I sort of wondered, is it the end of us? But I think not,” he says.

    For Thomas, the word Anthropocene neatly summarizes the sense that humans are part of Earth’s system and integral to its processes — what he calls indivisible connectedness. “That helps move us away from the notion that somehow humanity is apart from the rest of nature and natural systems,” he says. “It’s undoable — the change is everywhere.”

    The concept of an era of human-driven change also provides convenient common ground for him to collaborate with researchers from other disciplines. “This is something that people in the arts and humanities and the social sciences have picked up as well,” he says. “It is a means of enabling communication about the extent to which we are living in a truly unprecedented and human-altered world.”

    Seen through that lens, the fact that the Anthropocene has been formally rejected because scientists can’t agree on when it began seems immaterial. “Many people in the humanities who are using the phrase find the concept of the articulation of a particular year, based on a deposit in a particular lake, a ridiculous way of framing the concept of a human-altered planet.”

    Jacquelyn Gill, a palaeoecologist at the University of Maine in Orono, agrees. “It’s a term that belongs to everyone. To people working in philosophy and literary criticism, in the arts, in the humanities, the sciences,” she says. “I think it’s far more meaningful in the way that it is currently being used, than in any attempts that stratigraphers could have made to restrict or define it in some narrow sense.”

    She adds: “It serves humanity best as a loose concept that we can use to define something that we all widely understand, which is that we live in an era where humans are the dominant force on ecological and geological processes.”

    Capturing human influences

    The idea of the Anthropocene is especially helpful to make clear that humans have been shaping the planet for thousands of years, and that not all of those changes have been bad, Gill says. “We could do a better job of thinking about human–environment relationships in ways that are not inherently negative all the time,” she says. “People are not a monolith, and neither are our attitudes or relationships to nature.”

    Some 80% of biodiversity is currently stewarded on Indigenous lands, Gill points out. “Which should tell you something, right? That it’s not the presence of people that’s the problem,” she says. “The solution to those problems is changing the way that many dominant cultures relate to the natural world.”

    The concept of the Anthropocene is owned by many fields, Gill says. “This reiterates the importance of understanding that the role of people on our planet requires many different ways of knowing and many different disciplines.”

    In a world in which the threat of climate change dominates environmental debates, the term Anthropocene can help to broaden the discussion, says Yadvinder Malhi, a biodiversity researcher at the University of Oxford, UK.

    “I use it all the time. For me, it captures the time where human influence has a global planetary effect, and it’s multidimensional. It’s much more than just climate change,” he says. “It’s what we’re doing. The oceans, the resources we are extracting, habitats changing.”

    He adds: “I need that term when I’m trying to capture this idea of humans affecting the planet in multiple ways because of the size of our activity.”

    The looseness of the term is popular, but would a formal definition help in any way? Malhi thinks it would. “There’s no other term available that captures the global multidimensional impacts on the planet,” he says. “But there is a problem in not having a formal definition if people are using it in different terms, in different ways.”

    Although the word ‘Anthropocene’ makes some researchers think of processes that began 10,000 years ago, others consider it to mean those of the past century. “I think a formal adoption, like a definition, would actually help to clarify that.”

    doi: https://doi.org/10.1038/d41586-024-00786-2

    The Anthropocene is dead. Long live the Anthropocene (Science)

    Panel rejects a proposed geologic time division reflecting human influence, but the concept is here to stay

    Original article

    5 MAR 2024, 4:00 PM ET

    BY PAUL VOOSEN

    A mushroom cloud rises in the night sky
    A 1953 nuclear weapons test in Nevada was among the human activities that could have marked the Anthropocene. NNSA/NEVADA FIELD OFFICE/SCIENCE SOURCE

    For now, we’re still in the Holocene.

    Science has confirmed that a panel of two dozen geologists has voted down a proposal to end the Holocene—our current span of geologic time, which began 11,700 years ago at the end of the last ice age—and inaugurate a new epoch, the Anthropocene. Starting in the 1950s, it would have marked a time when humanity’s influence on the planet became overwhelming. The vote, first reported by The New York Times, is a stunning—though not unexpected—rebuke for the proposal, which has been working its way through a formal approval process for more than a decade.

    “The decision is definitive,” says Philip Gibbard, a geologist at the University of Cambridge who is on the panel and serves as secretary-general of the International Commission on Stratigraphy (ICS), the body that governs the geologic timescale. “There are no outstanding issues to be resolved. Case closed.”

    The leaders of the Anthropocene Working Group (AWG), which developed the proposal for consideration by ICS’s Subcommission on Quaternary Stratigraphy, are not yet ready to admit defeat. They note that the online tally, in which 12 out of 18 subcommission members voted against the proposal, was leaked to the press without approval of the panel’s chair. “There remain several issues that need to be resolved about the validity of the vote and the circumstances surrounding it,” says Colin Waters, a geologist at the University of Leicester who chaired AWG.

    Few opponents of the Anthropocene proposal doubted the enormous impact that human influence, including climate change, is having on the planet. But some felt the proposed marker of the epoch—some 10 centimeters of mud from Canada’s Crawford Lake that captures the global surge in fossil fuel burning, fertilizer use, and atomic bomb fallout that began in the 1950s—isn’t definitive enough.

    Others questioned whether it’s even possible to affix one date to the start of humanity’s broad planetary influence: Why not the rise of agriculture? Why not the vast changes that followed European encroachment on the New World? “The Anthropocene epoch was never deep enough to understand human transformation of this planet,” says Erle Ellis, a geographer at the University of Maryland, Baltimore County who resigned last year in protest from AWG.

    Opponents also felt AWG made too many announcements to the press over the years while being slow to submit a proposal to the subcommission. “The Anthropocene epoch was pushed through the media from the beginning—a publicity drive,” says Stanley Finney, a stratigrapher at California State University Long Beach and head of the International Union of Geological Sciences, which would have had final approval of the proposal.

    Finney also complains that from the start, AWG was determined to secure an “epoch” categorization, and ignored or countered proposals for a less formal Anthropocene designation. If they had only made their formal proposal sooner, they could have avoided much lost time, Finney adds. “It would have been rejected 10 years earlier if they had not avoided presenting it to the stratigraphic community for careful consideration.”

    The Anthropocene backers will now have to wait for a decade before their proposal can be considered again. ICS has long instituted this mandatory cooling-off period, given how furious debates can turn, for example, over the boundary between the Pliocene and Pleistocene, and whether the Quaternary—our current geologic period, a category above epochs—should exist at all.

    Even if it is not formally recognized by geologists, the Anthropocene is here to stay. It is used in art exhibits, journal titles, and endless books. And Gibbard, Ellis, and others have advanced the view that it can remain an informal geologic term, calling it the “Anthropocene event.” Like the Great Oxygenation Event, in which cyanobacteria flushed the atmosphere with oxygen billions of years ago, the Anthropocene marks a huge transition, but one without an exact date. “Let us work together to ensure the creation of a far deeper and more inclusive Anthropocene event,” Ellis says.

    Waters and his colleagues will continue to press that the Anthropocene is worthy of recognition in the geologic timescale, even if that advocacy has to continue in an informal capacity, he says. Although small in size, Anthropocene strata such as the 10 centimeters of lake mud are distinct and can be traced using more than 100 durable geochemical signals, he says. And there is no going back to where the planet was 100 years ago, he says. “The Earth system changes that mark the Anthropocene are collectively irreversible.”


    doi: 10.1126/science.z3wcw7b

    Are We in the ‘Anthropocene,’ the Human Age? Nope, Scientists Say. (New York Times)

    A panel of experts voted down a proposal to officially declare the start of a new interval of geologic time, one defined by humanity’s changes to the planet.

    In weighing their decision, scientists considered the effect on the world of nuclear activity. A 1946 test blast over Bikini atoll. Credit: Jack Rice/Associated Press

    Original article

    By Raymond Zhong

    March 5, 2024

    The Triassic was the dawn of the dinosaurs. The Paleogene saw the rise of mammals. The Pleistocene included the last ice ages.

    Is it time to mark humankind’s transformation of the planet with its own chapter in Earth history, the “Anthropocene,” or the human age?

    Not yet, scientists have decided, after a debate that has spanned nearly 15 years. Or the blink of an eye, depending on how you look at it.

    A committee of roughly two dozen scholars has, by a large majority, voted down a proposal to declare the start of the Anthropocene, a newly created epoch of geologic time, according to an internal announcement of the voting results seen by The New York Times.

    By geologists’ current timeline of Earth’s 4.6-billion-year history, our world right now is in the Holocene, which began 11,700 years ago with the most recent retreat of the great glaciers. Amending the chronology to say we had moved on to the Anthropocene would represent an acknowledgment that recent, human-induced changes to geological conditions had been profound enough to bring the Holocene to a close.

    The declaration would shape terminology in textbooks, research articles and museums worldwide. It would guide scientists in their understanding of our still-unfolding present for generations, perhaps even millenniums, to come.

    In the end, though, the members of the committee that voted on the Anthropocene over the past month were not only weighing how consequential this period had been for the planet. They also had to consider when, precisely, it began.

    By the definition that an earlier panel of experts spent nearly a decade and a half debating and crafting, the Anthropocene started in the mid-20th century, when nuclear bomb tests scattered radioactive fallout across our world. To several members of the scientific committee that considered the panel’s proposal in recent weeks, this definition was too limited, too awkwardly recent, to be a fitting signpost of Homo sapiens’s reshaping of planet Earth.

    “It constrains, it confines, it narrows down the whole importance of the Anthropocene,” said Jan A. Piotrowski, a committee member and geologist at Aarhus University in Denmark. “What was going on during the onset of agriculture? How about the Industrial Revolution? How about the colonizing of the Americas, of Australia?”

    “Human impact goes much deeper into geological time,” said another committee member, Mike Walker, an earth scientist and professor emeritus at the University of Wales Trinity Saint David. “If we ignore that, we are ignoring the true impact, the real impact, that humans have on our planet.”

    Hours after the voting results were circulated within the committee early Tuesday, some members said they were surprised at the margin of votes against the Anthropocene proposal compared with those in favor: 12 to four, with two abstentions. (Another three committee members neither voted nor formally abstained.)

    Even so, it was unclear on Tuesday whether the results stood as a conclusive rejection or whether they might still be challenged or appealed. In an email to The Times, the committee’s chair, Jan A. Zalasiewicz, said there were “some procedural issues to consider” but declined to discuss them further. Dr. Zalasiewicz, a geologist at the University of Leicester, has expressed support for canonizing the Anthropocene.

    This question of how to situate our time in the narrative arc of Earth history has thrust the rarefied world of geological timekeepers into an unfamiliar limelight.

    The grandly named chapters of our planet’s history are governed by a body of scientists, the International Union of Geological Sciences. The organization uses rigorous criteria to decide when each chapter started and which characteristics defined it. The aim is to uphold common global standards for expressing the planet’s history.

    Polyethylene being extruded and fed into a cooling bath during plastics manufacture, circa 1950. Credit: Hulton Archive, via Getty Images

    Geoscientists don’t deny our era stands out within that long history. Radionuclides from nuclear tests. Plastics and industrial ash. Concrete and metal pollutants. Rapid greenhouse warming. Sharply increased species extinctions. These and other products of modern civilization are leaving unmistakable remnants in the mineral record, particularly since the mid-20th century.

    Still, to qualify for its own entry on the geologic time scale, the Anthropocene would have to be defined in a very particular way, one that would meet the needs of geologists and not necessarily those of the anthropologists, artists and others who are already using the term.

    That’s why several experts who have voiced skepticism about enshrining the Anthropocene emphasized that the vote against it shouldn’t be read as a referendum among scientists on the broad state of the Earth. “This was a narrow, technical matter for geologists, for the most part,” said one of those skeptics, Erle C. Ellis, an environmental scientist at the University of Maryland, Baltimore County. “This has nothing to do with the evidence that people are changing the planet,” Dr. Ellis said. “The evidence just keeps growing.”

    Francine M.G. McCarthy, a micropaleontologist at Brock University in St. Catharines, Ontario, is the opposite of a skeptic: She helped lead some of the research to support ratifying the new epoch.

    “We are in the Anthropocene, irrespective of a line on the time scale,” Dr. McCarthy said. “And behaving accordingly is our only path forward.”

    The Anthropocene proposal got its start in 2009, when a working group was convened to investigate whether recent planetary changes merited a place on the geologic timeline. After years of deliberation, the group, which came to include Dr. McCarthy, Dr. Ellis and some three dozen others, decided that they did. The group also decided that the best start date for the new period was around 1950.

    The group then had to choose a physical site that would most clearly show a definitive break between the Holocene and the Anthropocene. They settled on Crawford Lake, in Ontario, where the deep waters have preserved detailed records of geochemical change within the sediments at the bottom.

    Last fall, the working group submitted its Anthropocene proposal to the first of three governing committees under the International Union of Geological Sciences. Sixty percent of each committee has to approve the proposal for it to advance to the next.
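
    As a rough, hypothetical sketch (my addition, not from the article), this is how the 60 percent approval rule described above might be checked against the tally reported earlier. The function name and the question of whether abstentions count toward the denominator are assumptions; either way, the reported vote falls far short of the threshold.

    ```python
    # Hypothetical check of the 60% approval threshold described in the article.
    # Assumption: the threshold is a share of votes cast; since it is unclear
    # whether abstentions count in the denominator, both options are shown.

    def passes(yes: int, no: int, abstain: int,
               threshold: float = 0.60, count_abstentions: bool = True) -> bool:
        """Return True if the 'yes' share meets or exceeds the threshold."""
        denominator = yes + no + (abstain if count_abstentions else 0)
        return denominator > 0 and yes / denominator >= threshold

    # Reported subcommission tally: 12 against, 4 in favor, 2 abstentions.
    print(passes(yes=4, no=12, abstain=2))   # False: 4/18 is about 22%
    print(passes(yes=4, no=12, abstain=0))   # False: 4/16 is 25%
    ```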

    The members of the first one, the Subcommission on Quaternary Stratigraphy, submitted their votes starting in early February. (Stratigraphy is the branch of geology concerned with rock layers and how they relate in time. The Quaternary is the ongoing geologic period that began 2.6 million years ago.)

    Under the rules of stratigraphy, each interval of Earth time needs a clear, objective starting point, one that applies worldwide. The Anthropocene working group proposed the mid-20th century because it bracketed the postwar explosion of economic growth, globalization, urbanization and energy use. But several members of the subcommission said humankind’s upending of Earth was a far more sprawling story, one that might not even have a single start date across every part of the planet.

    The world’s first full-scale atomic power station in Britain in 1956. Credit: Hulton Archive, via Getty Images

    This is why Dr. Walker, Dr. Piotrowski and others prefer to describe the Anthropocene as an “event,” not an “epoch.” In the language of geology, an “event” is a looser designation. Events don’t appear on the official timeline, and no committees need to approve their start dates.

    Yet many of the planet’s most significant happenings are called events, including mass extinctions, rapid expansions of biodiversity and the filling of Earth’s skies with oxygen 2.1 to 2.4 billion years ago.

    Even if the subcommission’s vote is upheld and the Anthropocene proposal is rebuffed, the new epoch could still be added to the timeline at some later point. It would, however, have to go through the whole process of discussion and voting all over again.

    Time will march on. Evidence of our civilization’s effects on Earth will continue accumulating in the rocks. The task of interpreting what it all means, and how it fits into the grand sweep of history, might fall to the future inheritors of our world.

    “Our impact is here to stay and to be recognizable in the future in the geological record — there is absolutely no question about this,” Dr. Piotrowski said. “It will be up to the people that will be coming after us to decide how to rank it.”

    Raymond Zhong reports on climate and environmental issues for The Times.

    Controversy over the Anthropocene: humanity still does not know which geological epoch it is living in (El País)

    elpais.com

    A committee of experts has voted down the proposal to declare a new geological epoch, but the panel’s own chair alleges irregularities in the vote

    Manuel Ansede

    Madrid –

    Extraction of a sediment core from the bottom of Crawford Lake, on the outskirts of Toronto, Canada. TIM PATTERSON / CARLETON UNIVERSITY

    The idea of the Anthropocene, the notion that since 1950 humanity has been living in a new geological epoch defined by human pollution, has become so popular in recent years that even the Real Academia Española added the term to its dictionary in 2021. This time the academicians were in too much of a hurry. The concept remains up in the air, in the middle of a heated dispute among specialists. Members of the expert committee that must make the decision at the International Union of Geological Sciences (IUGS), the Subcommission on Quaternary Stratigraphy, leaked to The New York Times on Tuesday that they had voted by a majority against recognizing the existence of the Anthropocene. However, the chair of the Subcommission, the geologist Jan Zalasiewicz, tells EL PAÍS that the preliminary result of the vote was announced without his authorization and that there are still “some outstanding issues with the votes that need to be resolved.” Humanity still does not know which geological epoch it is living in.

    The Dutch chemist Paul Crutzen, who won the Nobel Prize in Chemistry for shedding light on the hole in the ozone layer, proposed in 2000 that the planet had entered a new epoch, brought about by the brutal impact of human beings. An international team of specialists, the Anthropocene Working Group, has analyzed the scientific evidence since 2009 and last year presented a proposal to officially proclaim this new geological epoch, marked by radioactivity from atomic bombs and by pollutants from the burning of coal and oil. Tiny Crawford Lake, on the outskirts of Toronto, Canada, was the site chosen to exemplify the beginning of the Anthropocene, thanks to the sediments on its bottom, undisturbed for centuries.

    A majority of the members of the IUGS Subcommission on Quaternary Stratigraphy voted against the proposal, according to the U.S. newspaper. The British geologist Colin Waters, leader of the Anthropocene Working Group, tells EL PAÍS that he learned of the result through the press. “We have not yet received official confirmation directly from the secretary of the Subcommission on Quaternary Stratigraphy. It seems The New York Times gets the results before we do, which is very disappointing,” Waters laments.

    The geologist acknowledges that the decision, if confirmed, would be the end of his current proposal, but he is not giving up. “We have many eminent researchers who wish to continue as a group, informally, defending the evidence that the Anthropocene should be formalized as an epoch,” he says. In his view, today’s geological strata, contaminated with radioactive isotopes, microplastics, ash and pesticides, have changed irreversibly with respect to those of the Holocene, the geological epoch that began more than 10,000 years ago, after the last glaciation. “Given the existing evidence, which keeps growing, I would not be surprised by a future call to reconsider our proposal,” says Waters, of the University of Leicester.

    The head of the Anthropocene Working Group maintains that there are “some procedural questions” that cast doubt on the validity of the vote. The Italian geologist Silvia Peppoloni, head of the IUGS Geoethics Commission, confirms that her team has produced a report on this fight between the Subcommission on Quaternary Stratigraphy and the Anthropocene Working Group. The document is on the desk of the IUGS president, the Briton John Ludden.

    The Canadian geologist Francine McCarthy was convinced that Crawford Lake would win over the skeptics. From the outside it looks small, barely 250 meters long, but it is nearly 25 meters deep. Its surface waters do not mix with those at its bed, so the bottom sediment can be read like a lasagna, in which each layer accumulates material deposited from the atmosphere. That underwater calendar of Crawford Lake reveals the so-called Great Acceleration, the moment around 1950 when humanity began to leave an ever more obvious footprint, with the detonation of atomic bombs, the massive burning of oil and coal, and the extinction of species.

    “Ignoring the enormous impact of humans on our planet since the mid-20th century has potentially harmful consequences, by minimizing the importance of scientific data for confronting the evident change in the Earth system, as Paul Crutzen pointed out almost 25 years ago,” warns McCarthy.

    In a vote, scientists deny that we are living in the Anthropocene, the geological epoch of humans (Folha de S.Paulo)

    www1.folha.uol.com.br

    Panel rejected the idea that the changes are profound enough to bring the Holocene to a close

    Raymond Zhong

    March 5, 2024

    The Triassic was the dawn of the dinosaurs. The Paleogene saw the rise of mammals. The Pleistocene included the last ice ages.

    Is it time to mark humanity’s transformation of the planet with its own chapter in Earth’s history, the “Anthropocene,” or the human epoch?

    Not yet, scientists have decided, after a debate that lasted nearly 15 years. Or the blink of an eye, depending on how you look at it.

    A committee of about two dozen scholars voted, by a large majority, against a proposal to declare the start of the Anthropocene, a newly created epoch of geological time, according to an internal announcement of the voting results seen by The New York Times.

    By geologists’ current timeline of Earth’s 4.6-billion-year history, our world is now in the Holocene, which began 11,700 years ago with the most recent retreat of the great glaciers.

    Amending the chronology to say that we have moved on to the Anthropocene would amount to acknowledging that recent human-induced changes to geological conditions have been profound enough to bring the Holocene to a close.

    The declaration would shape terminology in textbooks, research articles and museums around the world. It would guide scientists in their understanding of our still-unfolding present for generations, perhaps even millennia.

    In the end, though, the committee members who voted on the Anthropocene in recent weeks were weighing not only how decisive this period has been for the planet. They also had to consider when, precisely, it began.

    By the definition that an earlier panel of experts spent nearly a decade and a half debating and drafting, the Anthropocene began in the mid-20th century, when nuclear bomb tests scattered radioactive material across our world.

    For several members of the scientific committee who assessed the panel’s proposal in recent weeks, that definition was too limited and too recent to serve as an adequate marker of Homo sapiens’s reshaping of planet Earth.

    “It constrains, it confines, it narrows down the whole importance of the Anthropocene,” said Jan A. Piotrowski, a committee member and geologist at Aarhus University in Denmark. “What was going on during the onset of agriculture? What about the Industrial Revolution? What about the colonization of the Americas, of Australia?”

    “Human impact goes much deeper into geological time,” said another committee member, Mike Walker, an Earth scientist and professor emeritus at the University of Wales Trinity Saint David. “If we ignore that, we are ignoring the true impact that humans have on our planet.”

    Hours after the voting results circulated within the committee on Tuesday (5) morning, some members said they were surprised by the margin of votes against the Anthropocene proposal compared with those in favor: 12 to 4, with 2 abstentions.

    Even so, it was unclear on Tuesday morning whether the results amounted to a conclusive rejection or whether they could still be contested or appealed. In an email to the Times, the committee’s chair, Jan A. Zalasiewicz, said there were “some procedural issues to consider” but declined to discuss them further.

    Zalasiewicz, a geologist at the University of Leicester, has expressed support for canonizing the Anthropocene.

    This question of how to situate our time in the narrative of Earth’s history has placed the world of the keepers of geological time in an unfamiliar light.

    The grandly named chapters of our planet’s history are governed by a group of scientists, the International Union of Geological Sciences. The organization uses rigorous criteria to decide when each chapter began and which characteristics defined it. The aim is to maintain common global standards for expressing the planet’s history.

    Geoscientists do not deny that our era stands out within that long history. Radionuclides from nuclear tests. Plastics and industrial ash. Concrete and metal pollutants. Rapid global warming. Sharply increased species extinctions. These and other products of modern civilization are leaving unmistakable traces in the mineral record, especially since the mid-20th century.

    Even so, to qualify for an entry on the geological time scale, the Anthropocene would have to be defined in a very specific way, one that met the needs of geologists and not necessarily those of the anthropologists, artists and others who are already using the term.

    That is why several experts who have expressed skepticism about enshrining the Anthropocene emphasized that the vote against it should not be read as a referendum among scientists on the broad state of the Earth.

    “This is, for the most part, a narrow, technical matter for geologists,” said one of those skeptics, Erle C. Ellis, an environmental scientist at the University of Maryland. “This has nothing to do with the evidence that people are changing the planet,” Ellis said. “The evidence just keeps growing.”

    Francine M.G. McCarthy, a micropaleontologist at Brock University in St. Catharines, Ontario, Canada, holds the opposite view: she helped lead some of the research supporting ratification of the new epoch.

    “We are in the Anthropocene, irrespective of a line on the time scale,” McCarthy said. “And behaving accordingly is our only path forward.”

    The Anthropocene proposal got its start in 2009, when a working group was convened to investigate whether recent planetary changes merited a place on the geological timeline.

    After years of deliberation, the group, which came to include McCarthy, Ellis and some three dozen others, decided that they did. The group also decided that the best start date for the new period was around 1950.

    The group then had to choose a physical site that would most clearly show a definitive break between the Holocene and the Anthropocene. They chose Crawford Lake, in Ontario, Canada, where the deep waters have preserved detailed records of geochemical change in the sediments at the bottom.

    Last fall, the working group submitted its Anthropocene proposal to the first of three governing committees of the International Union of Geological Sciences; 60% of each committee must approve the proposal for it to advance to the next.

    The members of the first committee, the Subcommission on Quaternary Stratigraphy, submitted their votes starting in early February. (Stratigraphy is the branch of geology devoted to the study of rock layers and how they relate to one another in time. The Quaternary is the ongoing geological period that began 2.6 million years ago.)

    Under the rules of stratigraphy, each interval of Earth time needs a clear, objective starting point that applies worldwide. The Anthropocene working group proposed the mid-20th century because it encompassed the postwar explosion of economic growth, globalization, urbanization and energy use.

    But several members of the subcommission said that humanity’s transformation of the Earth is a much broader story, one that may not even have a single start date in every part of the planet.

    That is why Walker, Piotrowski and others prefer to describe the Anthropocene as an “event,” not an “epoch.” In the language of geology, “event” is a looser term. Events do not appear on the official timeline, and no committee needs to approve their start dates.

    Yet many of the planet’s most significant happenings are called events, including mass extinctions, rapid expansions of biodiversity and the filling of Earth’s skies with oxygen 2.1 billion to 2.4 billion years ago.

    Even if the subcommission’s vote is upheld and the Anthropocene proposal is rejected, the new epoch could still be added to the timeline at some later point. It would, however, have to go through the entire process of discussion and voting all over again.

    Q&A: To Save The Planet, Traditional Indigenous Knowledge Is Indispensable (Inside Climate News)

    Politics & Policy

    Indigenous peoples’ ecological expertise honed over centuries is increasingly being used by policymakers to complement mainstream science.

    By Katie Surma

    February 14, 2024

    A member of the Indigenous Baduy tribe works at his field on Indonesia’s Java island. Anthropologist Gonzalo Oviedo says Indigenous communities in Southeast Asia “tend to recognize many more varieties of plant subspecies.” Credit: Bay Ismoyo/AFP via Getty Images

    The past few years have been a triumph for traditional Indigenous knowledge, the body of observations, innovations and practices developed by Indigenous peoples throughout history with regard to their local environment. 

    First, the world’s top scientific and environmental policymaking bodies embraced it. Then, in 2022, the Biden administration instructed U.S. federal agencies to include it in their decision making processes. And, last year, the National Science Foundation announced $30 million in grants to fund it.

    Traditional Indigenous knowledge, also called traditional ecological knowledge or traditional knowledge, is compiled by tribes according to their distinct culture and generally is transmitted orally between generations. It has evolved since time immemorial, yet mainstream institutions have only begun to recognize its value for helping to address pressing global problems like climate change and biodiversity loss, to say nothing of its cultural importance.  

    Traditional Indigenous knowledge has helped communities sustainably manage territories and natural resources—from predicting natural disasters to protecting biologically important areas and identifying medicinal plants. Today, more than a quarter of land globally is occupied, managed or owned by Indigenous peoples and local communities, with roughly 80 percent of Earth’s biodiversity located on Indigenous territories. Study after study has confirmed that those lands have better environmental outcomes than alternatives. 

    But, just as the links between those outcomes and Indigenous expertise are becoming more widely acknowledged, the communities stewarding this knowledge are coming under increasing threat from land grabbing, rapid cultural changes and other factors.

    Then there is the backlash from the right and the left. As traditional Indigenous knowledge has moved into the mainstream alongside science for a better understanding and management of the natural world, critics on all sides have emerged. Some have argued that just as Christian creationism is incompatible with science, so too is traditional knowledge—this argument is widely seen as premised on a misunderstanding about what traditional knowledge is. On the other end of the ideological spectrum, some progressives have balked at the notion that there are fundamental differences between the two systems. 

    For a better understanding of what traditional knowledge is, Inside Climate News spoke with Gonzalo Oviedo, an anthropologist and environmental scientist who has worked on social aspects of conservation for more than three decades. This conversation has been lightly edited for clarity and length.

    For people who may not know much about traditional knowledge, can you give some examples of what it is? 

    One key element of traditional knowledge is the understanding of where key biodiversity areas are located in the landscape where communities have traditionally lived. 

    This is exactly what conservation science does: identify areas that contain important genetic resources, or areas that contain important features that influence the rest of the ecosystem. 

    Traditional cultures do exactly this with areas that are key for the reproduction of animal species, for conserving water sources or for harboring certain types of plants including medicinal plants. Often, those areas become sacred places that Indigenous communities protect very rigorously. Protecting those key biodiversity areas is one of the most important management practices and it’s based on an understanding of how an ecosystem works in a given area. 

    Another element is closely related to the work of botanists, which is the creation of very sophisticated botanic taxonomy (the systematic classification of organisms). There are taxonomic systems generated by Indigenous peoples that are more sophisticated than mainstream taxonomy. In Southeast Asia, for example, Indigenous communities tend to recognize many more varieties of plant subspecies based on their practices and lifeways. They see the plants in a more detailed way and notice more differences. They also have more linguistic terms for diverse shades of green that represent different types of plants. 

    A third element is the understanding of the biological succession of forests and other ecosystems. Communities have very detailed knowledge of how ecosystems have changed and evolved over long periods of time. People who live within ecosystems, and in a way where their livelihoods are connected to the ecosystem, are a fundamental source of direct knowledge of how ecosystems evolve. 

    In places like the Arctic, where people are dependent on their ability to predict changes in the climate, there has been a lot of important research done with Indigenous communities to systematize their climate knowledge. In dry land climates, where traditional communities are very vulnerable to changes in precipitation, they’ve identified key biodiversity areas that serve as reservoirs for periods when droughts are prolonged and these communities strictly protect those reservoirs. Fishing communities in the Pacific are extremely knowledgeable about marine biodiversity and the management of those ecosystems.

    What developments have contributed to more mainstream acceptance of traditional knowledge? It’s hard to imagine that Indigenous peoples’ advocacy for stronger protection of their rights hasn’t played a role. Have there been other developments contributing to the growing recognition of the value of this knowledge system to global conservation efforts? 

    The process of integrating traditional knowledge into the mainstream is still relatively new. Only in the last 20 years or so has there been more significant progress on this. The Convention on Biological Diversity, the CBD, entered into force in 1993 and has a very important provision in Article 8(j) on the recognition of traditional knowledge and the need to “respect, preserve and maintain” it. As a result of that provision, there has been a lot of interest in how to integrate that into public policy, biodiversity management and related fields. 

    The evolution of nature conservation paradigms in the last 20 to 30 years or so has also been an important driver. Three decades ago it was still very difficult to get conservation organizations to recognize that the traditional knowledge of Indigenous and local communities is a positive factor for conservation and that working together with those communities is fundamental. Today, the conservation movement universally agrees to this.

    When you say “evolution of nature conservation paradigms,” are you referring to the shift away from “fortress conservation,” or the model where protected areas were fenced off and Indigenous and local communities removed from their traditional lands in the name of conservation? 

    There have been several factors contributing to the change and moving away from the fortress conservation concept to inclusive conservation has been one of them. By inclusive I mean the understanding that Indigenous and community held lands are better protected through traditional management practices and the value of traditional knowledge associated with that. 

    It is also better recognized today that working for sustainable livelihoods like subsistence farming and harvesting is good for conservation. In the past, livelihood activities were seen as a threat to conservation. Today, it is widely accepted that by supporting sustainable livelihoods, you’re supporting conservation as well. 

    Also, today, it is recognized that humans have always managed ecosystems. The concept of “empty wilderness” is no longer viable for conservation and it’s not true in most parts of the world. These are several ways that the conservation paradigm is evolving. It’s safe to say that not everyone is on the same page. But things are evolving in the direction of inclusiveness. 

    What are some of the biggest challenges to ensuring that traditional knowledge is protected and, if approved by communities, transmitted for use in mainstream conservation efforts?

    There are two main challenges. One relates to how other knowledge systems see traditional knowledge. 

    This is essentially the problem of getting people to understand what traditional knowledge is, and overcoming unhelpful and incorrect stereotypes about it. For example, some people say that, unlike science, traditional knowledge is not based on evidence or is not based on credible scientific processes that allow for verification. That is not necessarily true. 

    There are, of course, differences between traditional knowledge and scientific knowledge. Traditional knowledge tends to use more qualitative methods and less quantitative approaches and methodologies compared to what science does today. 

    But there are several aspects in which both are quite similar. To start, the key motivation in both systems is problem solving. The intellectual process of both sometimes works through comparisons and applies methods of trial and error. You also have in both the process of moving from practical knowledge to abstraction, and also feedback looping and adaptive learning.

    Misunderstandings or stereotypes about what traditional knowledge is have led to unfriendly public policies in natural resource management and education systems. 

    To address this, institutions like the Intergovernmental Panel on Climate Change (IPCC) and the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) need to continue collecting evidence and information about traditional knowledge and communicate its value to policymakers. That is still a fundamental need. 

    A second major challenge relates to the erosion of the intergenerational transmission of traditional knowledge. That transmission mostly happens through oral systems that require direct physical contact between different generations. That is being lost because of demographic changes, migration and the use of formal education systems that take children into schools and separate them physically from transmitters of traditional knowledge. This is a serious problem but there are examples of helpful actions that have been implemented in places like Ecuador, where the formal education system works together with Indigenous communities under an inter-cultural model. 

    Another aspect of this is the loss of knowledge. If there is lack of transmission or insufficient transmission between generations, when elders die a significant amount of knowledge dies with them.

    Cultural change is also a factor. People are coming into contact with other forms of knowledge, some that are presented in a more dynamic way, like on television, and that tend to capture the attention of younger people. 

    The pace of change is happening so fast. Traditional knowledge is transmitted slowly through in person contact and in the context of daily life. If the pace of cultural change isn’t managed, and communities aren’t supported in their maintenance of knowledge transmission, then that knowledge will be irreversibly lost. 

    There has been some pushback to the incorporation of traditional knowledge alongside science in policy making and into education curriculums. Critics have analogized traditional knowledge with “creationism.”  What do you make of this? 

    It’s important to understand precisely what traditional knowledge is and to differentiate it from spirituality. 

    Communities often connect spirituality with traditional knowledge. Spirituality is part of the traditional life of the communities, but spirituality is not in itself traditional knowledge. For example, people in Laos fishing communities that live around wetlands have a sophisticated knowledge of how wetlands function. They have for generations fished and taken resources from the wetlands.

    Based on their traditional knowledge of the wetlands, they understand the need for rules to avoid depletion of fish populations by preserving key areas for reproduction and ecological processes. They have developed a set of norms so people understand they cannot fish in certain areas, and those norms take place through spirituality. They say, “You can’t fish in this area because this is where our spirits live and these spirits shouldn’t be interfered with.” This becomes a powerful norm because it connects with a deep spiritual value of the community. 

    This doesn’t mean that when recognizing the traditional knowledge of the community, one has to take the topic of the spirits as knowledge that has to be validated. The spiritual aspect is the normative part, articulated around beliefs, it is not the knowledge. The same thing goes for practices protecting key biodiversity areas. Traditional cultures all over the world have sacred sites, waters, and they are based on some knowledge of how the ecosystem works and the need to protect key and sensitive areas. Traditional knowledge is essentially problem solving, practical and develops through empirical processes of observation and experience. You have to distinguish it from spirituality, that develops through stories, myths and visions from spiritual leaders. 

    The relationship between knowledge and spiritual beliefs happened in a similar way in the history of western science and with traditional Chinese medicine. Historically, you will find that Chinese medical science was intimately linked to Taoist religion and Confucianism. Yet the value of Chinese medicine doesn’t mean that you have to adopt Taoism or Confucianism. It takes a long time for societies to understand how to distinguish these things because their connections are very complex. 

    What is at stake if traditional knowledge is lost? 

    First, that would be a loss for all of humanity. There has been recent research showing that traditional knowledge can benefit the whole of society if understood and transmitted to other knowledge systems.

    There are certain aspects of traditional knowledge that, if lost, will be difficult to recuperate like elements of botanic taxonomy that are not recorded. If lost, we’re losing an important part of human knowledge. 

    Second, traditional knowledge is important for cultures that have generated and use that knowledge, especially for their adaptation to climatic and other changes. If properly recognized and supported, that knowledge can be a factor of positive development and evolution for those communities. Change is happening everywhere and will continue to happen in traditional societies. But there are different types of cultural change and some are destructive to traditional communities, like the absorption of invasive external values and mythologies that completely destroy young peoples’ cultural background and erode the fabrics of traditional societies. 

    There is also cultural change that can be positive if it is well managed. Young peoples’ use of technology could be a good source of change if it is used to help maintain and transmit their traditional culture. That can prompt pride and value in communities, and promote intercultural understanding which is fundamental in a world where there is still so much cultural discrimination against Indigenous peoples and a lack of understanding of their cultures and value systems.

    Traditional knowledge can play an important role in intercultural dialogues. We need healing processes within societies so that cultures can speak to each other on equal footing, which unfortunately isn’t the case in many places today. 

    Katie Surma – Reporter, Pittsburgh

    Katie Surma is a reporter at Inside Climate News focusing on international environmental law and justice. Before joining ICN, she practiced law, specializing in commercial litigation. She also wrote for a number of publications and her stories have appeared in the Washington Post, USA Today, Chicago Tribune, Seattle Times and The Associated Press, among others. Katie has a master’s degree in investigative journalism from Arizona State University’s Walter Cronkite School of Journalism, an LLM in international rule of law and security from ASU’s Sandra Day O’Connor College of Law, a J.D. from Duquesne University, and was a History of Art and Architecture major at the University of Pittsburgh. Katie lives in Pittsburgh, Pennsylvania, with her husband, Jim Crowell.

    The Weather Man (Stanford Magazine)

    Daniel Swain studies extreme floods. And droughts. And wildfires. Then he explains them to the rest of us.

    February 6, 2024

        

    An illustration of Daniel Swain walking through the mountains and clouds.

    By Tracie White

    Illustrations by Tim O’Brien

    7:00 a.m., 45 degrees F

    The moment Daniel Swain wakes up, he gets whipped about by hurricane-force winds. “A Category 5, literally overnight, hits Acapulco,” says the 34-year-old climate scientist and self-described weather geek, who gets battered daily by the onslaught of catastrophic weather headlines: wildfires, megafloods, haboobs (an intense dust storm), atmospheric rivers, bomb cyclones. Everyone’s asking: Did climate change cause these disasters? And, more and more, they want Swain to answer.

    Swain, PhD ’16, rolls over in bed in Boulder, Colo., and checks his cell phone for emails. Then, retainer still in his mouth, he calls back the first reporter of the day. It’s October 25, and Isabella Kwai at the New York Times wants to know whether climate change is responsible for the record-breaking speed and ferocity of Hurricane Otis, which rapidly intensified and made landfall in Acapulco as the eastern Pacific’s strongest hurricane on record. It caught everyone off guard. Swain posted on X (formerly known as Twitter) just hours before the storm hit: “A tropical cyclone undergoing explosive intensification unexpectedly on final approach to a major urban area . . . is up there on list of nightmare weather scenarios becoming more likely in a warming #climate.”

    Swain is simultaneously 1,600 miles away from the tempest and at the eye of the storm. His ability to explain science to the masses—think the Carl Sagan of weather—has made him one of the media’s go-to climate experts. He’s a staff research scientist at UCLA’s Institute of the Environment and Sustainability who spends more than 1,100 hours each year on public-facing climate and weather communication, explaining whether (often, yes) and how climate change is raising the number and exacerbating the viciousness of weather disasters. “I’m a physical scientist, but I not only study how the physics and thermodynamics of weather evolve but how they affect people,” says Swain. “I lead investigations into how extreme events like floods and droughts and wildfires are changing in a warming climate, and what we might do about it.”

    He translates that science to everyday people, even as the number of weather-disaster headlines grows each year. “To be quite honest, it’s nerve-racking,” says Swain. “There’s such a demand. But there’s a climate emergency, and we need climate scientists to talk to the world about it.”

    No bells, no whistles. No fancy clothes, makeup, or vitriolic speech. Sometimes he doesn’t even shave for the camera. Just a calm, matter-of-fact voice talking about science on the radio, online, on TV. In 2023, he gave nearly 300 media interviews—sometimes at midnight or in his car. The New York Times, CNN, and BBC keep him on speed dial. Social media is Swain’s home base. His Weather West blog reaches millions. His weekly Weather West “office hours” on YouTube are public and interactive, doubling as de facto press conferences. His tweets reach 40 million people per year. “I don’t think that he appreciates fully how influential he is of the public understanding of weather events, certainly in California but increasingly around the world,” says Stanford professor of earth system science Noah Diffenbaugh, ’96, MS ’97, Swain’s doctoral adviser and mentor. “He’s such a recognizable presence in newspapers and radio and television. Daniel’s the only climate scientist I know who’s been able to do that.”

    Illustration of Daniel Swain's reflection in a puddle.

    There’s no established job description for climate communicator—what Swain calls himself—and no traditional source of funding. He’s not particularly a high-energy person, nor is he naturally gregarious; in fact, he has a chronic medical condition that often saps his energy. But his work is needed, he says. “Climate change is an increasingly big part of what’s driving weather extremes today,” Swain says. “I connect the dots between the two. There’s a lot of misunderstanding about how a warming climate affects day-to-day variations in weather, but my goal is to push public perception toward what the science actually says.” So when reporters call him, he does his best to call them back. 


    7:30 a.m., winds at 5 mph from the east northeast

    Swain finishes the phone call with the Times reporter and schedules a Zoom interview with Reuters for noon. Then he brushes his teeth. He’s used to a barrage of requests when there’s a catastrophic weather event. Take August 2020, when, over three days, California experienced 14,000 lightning strikes from “dry” thunderstorms. More than 650 reported wildfires followed, eventually turning the skies over San Francisco a dystopian orange. “In a matter of weeks, I did more than 100 interviews with television, radio, and newspaper outlets, and walked a social media audience of millions through the disaster unfolding in their own backyards,” he wrote in a recent essay for Nature.

    Swain’s desire to understand the physics of weather stretches back to his preschool years. In 1993, his family moved from San Francisco across the Golden Gate Bridge to San Rafael, and the 4-year-old found himself wondering where all that Bay City fog had gone. Two years later, Swain spent the first big storm of his life under his parents’ bed. He lay listening to screeching 100 mile-per-hour winds around his family’s home, perched on a ridge east of Mount Tamalpais. But he was more excited than scared. The huge winter storm of 1995 that blew northward from San Francisco and destroyed the historic Conservatory of Flowers just got 6-year-old Swain wired.

    ‘Climate change is an increasingly big part of what’s driving weather extremes today. I connect the dots between the two.’

    “To this day, it’s the strongest winds I’ve ever experienced,” he says. “It sent a wind tunnel through our house.” It broke windows. Shards of glass embedded in one of his little brother’s stuffies, which was sitting in an empty bedroom. “I remember being fascinated,” he says. So naturally, when he got a little older, he put a weather station on top of that house. And then, in high school, he launched his Weather West blog. “It was read by about 10 people,” Swain says, laughing. “I was a weather geek. It didn’t exactly make me popular.” Two decades, 550 posts, and 2 million readers later, well, who’s popular now?

    Swain graduated from UC Davis with a bachelor’s degree in atmospheric science. He knew then that something big was happening on the weather front, and he wanted to understand how climate change was influencing the daily forecast. So at Stanford, he studied earth system science and set about using physics to understand the causes of changing North Pacific climate extremes. “From the beginning, Daniel had a clear sense of wanting to show how climate change was affecting the weather conditions that matter for people,” says Diffenbaugh. “A lot of that is extreme weather.” Swain focused on the causes of persistent patterns in the atmosphere—long periods of drought or exceptionally rainy winters—and how climate change might be exacerbating them.

    The first extreme weather event he studied was the record-setting California drought that began in 2012. He caught the attention of both the media and the scientific community after he coined the term Ridiculously Resilient Ridge, referring to a persistent ridge of high pressure caused by unusual oceanic warmth in the western tropical Pacific Ocean. That ridge was blocking weather fronts from bringing rain into California. The term was initially tongue-in-cheek. Today the Ridiculously Resilient Ridge (aka RRR or Triple R) has a Wikipedia page.

    “One day, I was sitting in my car, waiting to pick up one of my kids, reading the news on my phone,” says Diffenbaugh. “And I saw this article in the Economist about the drought. It mentioned this Ridiculously Resilient Ridge. I thought, ‘Oh, wow, that’s interesting. That’s quite a branding success.’ I click on the page and there’s a picture of Daniel Swain.”

    Diffenbaugh recommended that Swain write a scientific paper about the Ridiculously Resilient Ridge, and Swain did, in 2014. By then, the phrase was all over the internet. “Journalists started calling while I was still at Stanford,” says Swain. “I gave into it initially, and the demand just kept growing from there.”


    11:45 a.m., precipitation 0 inches

    Swain’s long, lanky frame is seated ramrod straight in front of his computer screen, scrolling for the latest updates about Hurricane Otis. At noon, he signs in to Zoom and starts answering questions again.

    Reuters: “Hurricane Otis wasn’t in the forecast until about six to 10 hours before it occurred. What would you say were the factors that played into its fierce intensification?”

    Swain: “Tropical cyclones, or hurricanes, require a few different ingredients. I think the most unusual one was the warmth of water temperature in the Pacific Ocean off the west coast of Mexico. It’s much higher than usual. This provided a lot of extra potential intensity to this storm. We expect to see increases in intensification of storms like this in a warming climate.”

    Swain’s dog, Luna, bored by the topic, snores softly. She’s asleep just behind him, next to a bookshelf filled with weather disaster titles: The Terror by Dan Simmons; The Water Will Come by Jeff Goodell; Fire Weather by John Vaillant. And the deceptively hopeful-sounding Paradise by Lizzie Johnson, which tells the story of the 2018 Camp Fire that burned the town of Paradise, Calif., to the ground. Swain was interviewed by Johnson for the book. The day of the fire, he found himself glued to the comment section of his blog, warning anyone who asked about evacuation to get out.

    “During the Camp Fire, people were commenting, ‘I’m afraid. What should we do? Do we stay or do we go?’ Literally life or death,” he says. He wrote them back: “There is a huge fire coming toward you very fast. Leave now.” As they fled, they sent him progressively more horrifying images of burning homes and trees like huge, flaming matchsticks. “This makes me extremely uncomfortable—that I was their best bet for help,” says Swain.

    Swain doesn’t socialize much. He doesn’t have time. His world revolves around his home life, his work, and taking care of his health. He has posted online about his chronic health condition, Ehlers-Danlos syndrome, a heritable connective tissue disease that, for him, results in fatigue, gastrointestinal problems, and injuries—he can partially dislocate a wrist mopping the kitchen floor. He works to keep his health condition under control when he has down time, traveling to specialists in Utah, taking medications and supplements, and being cautious about any physical activity. When he hikes in the Colorado Rocky Mountains, he’s careful and tries to keep his wobbly ankles from giving out. Doctors don’t have a full understanding of EDS. So, Swain researches his illness himself, much like he does climate science, constantly looking for and sifting through new data, analyzing it, and sometimes sharing what he discovers online with the public. “If it’s this difficult to parse even as a professional scientist and science communicator, I can only imagine how challenging this task is for most other folks struggling with complex/chronic illnesses,” he wrote on Twitter. 

    ‘“There is a huge fire coming toward you very fast. Leave now.” This makes me extremely uncomfortable—that I was their best bet for help. ’

    It helps if he can exert some control over his own schedule to minimize fatigue. The virtual world has helped him do that. He mostly works from a small, extra bedroom in an aging rental home perched at an elevation of 5,400 feet in Boulder, where he lives with his partner, Jilmarie Stephens, a research scientist at the University of Colorado Boulder.

    When Swain was hired at UCLA in 2018, Peter Kareiva, the then director of the Institute of the Environment and Sustainability, supported a nontraditional career path that would allow Swain to split his time between research and climate communication, with the proviso that he find grants to fund much of his work. That same year, Swain was invited to join a group at the National Center for Atmospheric Research (NCAR) located in Boulder, which has two labs located at the base of the Rocky Mountains. 

    “Daniel had a very clear vision about how he wanted to contribute to science and the world, using social media and his website,” says Kareiva, a research professor at UCLA. “We will not solve climate change without a movement, and communication and social media are key to that. Most science papers are never even read. What we do as scientists only matters if it has an impact on the world. We need at least 100 more Daniels.”

    And yet financial support for this type of work is never assured. In a recent essay in Nature, Swain writes about what he says is a desperate need for more institutions to fund climate communication by scientists. “Having a foot firmly planted in both research and public-engagement worlds has been crucial,” he writes. “Even as I write this, it’s unclear whether there will be funding to extend my present role beyond the next six months.”


    4:00 p.m., 67 degrees F

    “Ready?” says the NBC reporter on the computer screen. “Can we just have you count to 10, please?”

    “Yep. One, two, three, four, five, six, seven, eight, nine, 10,” Swain says.

    “Walk me through in a really concise way why we saw this tropical storm, literally overnight, turn into a Category 5 hurricane, when it comes to climate change,” the reporter says.

    “So, as the Earth warms, not only does the atmosphere warm or air temperatures increase, but the oceans are warming as well. And because warm tropical oceans are hurricane fuel, the maximum potential intensity of hurricanes is set by how warm the oceans are,” Swain says.

    An hour later, Swain lets Luna out and prepares for the second half of his day: He’ll spend the next five hours on a paper for a science journal. It’s a review of research on weather whiplash in California—the phenomenon of rapid swings between extremes, such as the 2023 floods that came on the heels of a severe drought. Using atmospheric modeling, Swain predicted in a 2018 Nature Climate Change study that there would be a 25 percent to 100 percent increase in extreme dry-to-wet precipitation events in the years ahead. Recent weather events support that hypothesis, and Swain’s follow-up research analyzes the ways those events are seriously stressing California’s water storage and flood control infrastructure.

    “What’s remarkable about this summer is that the record-shattering heat has occurred not only over land but also in the oceans,” Swain explained in an interview with Katie Couric on YouTube in August, “like the hot tub [temperature] water in certain parts of the shallow coastal regions off the Gulf of Mexico.” In a warming climate, the atmosphere acts as a kitchen sponge, he explains later. It soaks up water but also wrings it out. The more rapid the evaporation, the more intense the precipitation. When it rains, there are heavier downpours and more extreme flood events.


    “It really comes down to thermodynamics,” he says. The increasing temperatures caused by greenhouse gases lead to more droughts, but they also cause more intense precipitation. The atmosphere is thirstier, so it takes more water from the land and from plants. The sponge holds more water vapor. That’s why California is experiencing these wild alternations, he says, from extremely dry to extremely wet. “It explains the role climate change plays in turning a tropical storm overnight into hurricane forces,” he says.
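
    Swain’s sponge analogy tracks a standard piece of textbook thermodynamics, the Clausius-Clapeyron relation, under which the atmosphere’s capacity to hold water vapor rises by roughly 6 to 7 percent for each degree Celsius of warming. The short sketch below only illustrates that scaling, using the Magnus approximation for saturation vapor pressure; it is not drawn from Swain’s own models or data.

        # Rough Clausius-Clapeyron scaling of the atmospheric "sponge",
        # using the Magnus approximation for saturation vapor pressure (hPa).
        import math

        def saturation_vapor_pressure_hpa(temp_c: float) -> float:
            """Approximate ceiling on the water vapor air can hold at temp_c."""
            return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

        for t in (15.0, 16.0, 17.0):
            print(f"{t:.0f} C: {saturation_vapor_pressure_hpa(t):.1f} hPa")

        rise = saturation_vapor_pressure_hpa(16.0) / saturation_vapor_pressure_hpa(15.0) - 1
        print(f"increase per extra degree near 15 C: {rise:.1%}")  # about 6-7%

    Run as written, the numbers come out to roughly 17, 18, and 19 hPa, a gain of about 6.6 percent per degree: a slightly bigger sponge with every increment of warming.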


    October 26, expected high of 45 degrees F

    In 2023, things got “ludicrously crazy” for both Swain and the world. It was the hottest year in recorded history. Summer temperatures broke records worldwide. The National Oceanic and Atmospheric Administration reported 28 confirmed weather/climate disaster events with losses exceeding $1 billion—among them a drought, four flooding events, 19 severe storm events, two tropical cyclones, and a killer wildfire. Overall, catastrophic weather events resulted in the deaths of 492 people in the United States. “Next year may well be worse than that,” Swain says. “It’s mind-blowing when you think about that.” 

    “There have always been floods and wildfires, hurricanes and storms,” Swain continues. “It’s just that now, climate change plays a role in most weather disasters”—pumped-up storms, more intense and longer droughts and wildfire seasons, and heavier rains and flooding. It also plays a role in our ability to manage those disasters, Swain says. In a 2023 paper he published in Communications Earth & Environment, for example, he provides evidence that climate change is shifting the ideal timing of prescribed burns (which help mitigate wildfire risk) from spring and autumn to winter.

    The day after Hurricane Otis strikes, Swain’s schedule has calmed down, so he takes time to make the short drive from his home up to the NCAR Mesa Lab, situated in a majestic spot where the Rocky Mountains meet the plains. Sometimes he’ll sit in his Hyundai in the parking lot, looking out his windshield at the movements of the clouds while doing media interviews on his cell phone. Today he scrolls through weather news updates on the aftermath of Hurricane Otis, keeping informed for the next interview that pops up, or his next blog post. In total, 52 people will be reported dead due to the disaster. The hurricane destroyed homes and hotels, high-rises and hospitals. Swain’s name will appear in at least a dozen stories on Hurricane Otis, including one by David Wallace-Wells, an opinion writer for the New York Times, columnist for the New York Times Magazine, and bestselling author of The Uninhabitable Earth: Life After Warming. “It’s easy to get pulled into overly dramatic ways of looking at where the world is going,” says Wallace-Wells, who routinely listens to Swain’s office hours and considers him a key source when he needs information on weather events. “Daniel helps people know how we can better calibrate those fears with the use of scientific rigor. He’s incredibly valuable.”

    From the parking lot in the mountains, Swain often watches the weather that blows across the wide-open plains that stretch for hundreds of miles, all the way to the Mississippi River. He never tires of examining weather in real time, learning from it. He studies the interplay between the weather and the clouds at this spot where storms continually roll in and roll out.

    “After all these years,” he says, “I’m still a weather geek.” 


    Tracie White is a senior writer at Stanford. Email her at traciew@stanford.edu.

    Reinaldo José Lopes: Layers at the bottom of a lake show how human presence has radically transformed the Earth (Folha de S.Paulo)

    www1.folha.uol.com.br

    Opinion

    Dec. 3, 2023, 11:15 p.m.

    “The world is changing: I feel it in the water, I feel it in the earth, and I smell it in the air.” Those who have only watched the films in the “Lord of the Rings” series got used to hearing this line in the august voice of Cate Blanchett (the elf Galadriel); in the books, it is spoken by the Ent (a tree-like giant) Treebeard. It is, at bottom, a summary of the conclusion of J.R.R. Tolkien’s fantasy novel: the end of one age and the beginning of another, defined by the Dominion of Men. And what if it were possible to directly detect something very much like that in our 21st-century world? Something that proves, beyond any doubt, that our species has come to shape the Earth irreversibly?

    The answer to that question can be found in many places, but everything indicates that its most compelling and consolidated version, the one that will go into the geology and history books, comes from Crawford Lake, in Canada. The scientists charged with formally defining the start of the so-called Anthropocene (the geological epoch characterized by massive human intervention in many aspects of how the planet works) are using the lake as the prime example of that phenomenon.

    That is why I invite the reader to take a dive into those alkaline waters. Understanding the details that make the place such a useful example for grasping the Anthropocene is, at once, a small lesson in scientific method and a portrait of the power (often destructive) that we have developed as a species.

    One of the most complete analyses of the Canadian lake was published in the scientific journal The Anthropocene Review by a team from Brock University, in Canada. The first thing to keep in mind is that Crawford Lake looks like a big funnel: relatively small (2.4 hectares in area) and deep (24 meters from surface to bed). That means its layers of water, although well oxygenated, mix very little. Because of the high salinity and alkalinity, there is little animal life at the bottom.

    And here is the first clever trick: those characteristics allow very stable layers of sediment to settle on the bed of Crawford Lake each year. Every year the same story plays out: in the fall, a darker film of organic matter sinks to the bottom (since this is Canada, many trees shed their leaves at that time of year); in the summer, that layer is covered by another, lighter one, rich in calcium minerals. This regularity is never disturbed by so-called bioturbation (aquatic invertebrates digging into the lake bed, for example).

    In other words, the bottom of the lake is like clockwork, or better yet, a calendar. Sediment cores taken from it can be dated year by year with very little uncertainty.

    That means it is possible to pinpoint the appearance of the chemical element plutonium (a direct result of the use of nuclear weapons, mainly in military tests) starting in 1948, with a peak in 1967 and a decline in the 1980s. Given the nature of radioactive elements, that signature will remain there, strictly speaking, “forever” (at least from a human point of view).
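
    To make the logic of the varve calendar concrete, here is a minimal sketch with made-up numbers (the collection year and the plutonium values are assumptions for illustration, not the Brock University team’s measurements): counting annual layers down from the top of a core assigns each one a calendar year, so the onset and peak of the plutonium signal can be read straight off the sediment.

        # Toy varve chronology: one counted layer = one calendar year.
        CORE_TOP_YEAR = 2019  # assumed year the sediment core was collected

        # Illustrative plutonium activity per layer, from the top of the core
        # downward (deeper layers are older); the values are invented.
        pu_activity = [0.2] * 50 + [0.6, 1.4, 3.1, 2.2, 1.0] + [0.3] * 17 + [0.0] * 10

        def varve_year(index_from_top: int) -> int:
            """Convert a counted layer position into a calendar year."""
            return CORE_TOP_YEAR - index_from_top

        layers_with_pu = [i for i, a in enumerate(pu_activity) if a > 0.0]
        onset_year = varve_year(max(layers_with_pu))  # deepest layer containing plutonium
        peak_year = varve_year(pu_activity.index(max(pu_activity)))
        print(f"plutonium appears around {onset_year}, peaks around {peak_year}")

    With these invented values the script reports an onset around 1948 and a peak around 1967, which is exactly the kind of reading the real core allows, with very little dating uncertainty.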

    Something very similar applies to so-called SCPs (spheroidal carbonaceous particles). They are produced by the industrial, high-temperature burning of coal and petroleum derivatives. They begin to appear in sediments from the second half of the 19th century, but their presence only really takes off, once again, in the early 1950s. Nothing other than human action could produce this phenomenon.

    That is why scientists are proposing the year 1950 as the start of the Anthropocene. Even if the proposal does not “stick” in exactly that form, the weight of evidence such as the layers of Crawford Lake is extremely hard to argue against. It is in the water, in the earth, and in the air. And, for better or worse, the responsibility is ours.

    Consciousness theory slammed as ‘pseudoscience’ — sparking uproar (Nature)

    nature.com

    Researchers publicly call out theory that they say is not well supported by science, but that gets undue attention.

    Mariana Lenharo

    20 September 2023


    Some research has focused on how neurons (shown here in a false-colour scanning electron micrograph) are involved in consciousness. Credit: Ted Kinsman/Science Photo Library

    A letter, signed by 124 scholars and posted online last week, has caused an uproar in the consciousness research community. It claims that a prominent theory describing what makes someone or something conscious — called the integrated information theory (IIT) — should be labelled “pseudoscience”. Since its publication on 15 September in the preprint repository PsyArXiv, the letter has some researchers arguing over the label and others worried it will increase polarization in a field that has grappled with issues of credibility in the past.

    “I think it’s inflammatory to describe IIT as pseudoscience,” says neuroscientist Anil Seth, director of the Centre for Consciousness Science at the University of Sussex near Brighton, UK, adding that he disagrees with the label. “IIT is a theory, of course, and therefore may be empirically wrong,” says neuroscientist Christof Koch, a meritorious investigator at the Allen Institute for Brain Science in Seattle, Washington, and a proponent of the theory. But he says that it makes its assumptions — for example, that consciousness has a physical basis and can be mathematically measured — very clear.

    There are dozens of theories that seek to understand consciousness — everything that a human or non-human experiences, including what they feel, see and hear — as well as its underlying neural foundations. IIT has often been described as one of the central theories, alongside others, such as global neuronal workspace theory (GNW), higher-order thought theory and recurrent processing theory. It proposes that consciousness emerges from the way information is processed within a ‘system’ (for instance, networks of neurons or computer circuits), and that systems that are more interconnected, or integrated, have higher levels of consciousness.

    A growing discomfort

    Hakwan Lau, a neuroscientist at Riken Center for Brain Science in Wako, Japan, and one of the authors of the letter, says that some researchers in the consciousness field are uncomfortable with what they perceive as a discrepancy between IIT’s scientific merit and the considerable attention it receives from the popular media because of how it is promoted by advocates. “Has IIT become a leading theory because of academic acceptance first, or is it because of the popular noise that kind of forced the academics to give it acknowledgement?” Lau asks.

    Negative feelings towards the theory intensified after it captured headlines in June. Media outlets, including Nature, reported the results of an ‘adversarial’ study that pitted IIT and GNW against one another. The experiments, which included brain scans, didn’t prove or completely disprove either theory, but some researchers found it problematic that IIT was highlighted as a leading theory of consciousness, prompting Lau and his co-authors to draft their letter.

    But why label IIT as pseudoscience? Although the letter doesn’t clearly define pseudoscience, Lau notes that a “commonsensical definition” is that pseudoscience refers to “something that is not very scientifically supported, that masquerades as if it is already very scientifically established”. In this sense, he thinks that IIT fits the bill.

    Is it testable?

    Additionally, Lau says, some of his co-authors think that it’s not possible to empirically test IIT’s core assumptions, which they argue contributes to the theory’s status as pseudoscience.

    Seth, who is not a proponent of IIT, although he has worked on related ideas in the past, disagrees. “The core claims are harder to test than other theories because it’s a more ambitious theory,” he says. But there are some predictions stemming from the theory, about neural activity associated with consciousness, for instance, that can be tested, he adds. A 2022 review found 101 empirical studies involving IIT.

    Liad Mudrik, a neuroscientist at Tel Aviv University, in Israel, who co-led the adversarial study of IIT versus GNW, also defends IIT’s testability at the neural level. “Not only did we test it, we managed to falsify one of its predictions,” she says. “I think many people in the field don’t like IIT, and this is completely fine. Yet it is not clear to me what is the basis for claiming that it is not one of the leading theories.”

    The same criticism about a lack of meaningful empirical tests could be made about other theories of consciousness, says Erik Hoel, a neuroscientist and writer who lives on Cape Cod, in Massachusetts, and who is a former student of Giulio Tononi, a neuroscientist at the University of Wisconsin-Madison who is a proponent of IIT. “Everyone who works in the field has to acknowledge that we don’t have perfect brain scans,” he says. “And yet, somehow, IIT is singled out in the letter as this being a problem that’s unique to it.”

    Damaging effect

    Lau says he doesn’t expect a consensus on the topic. “But I think if it is known that, let’s say, a significant minority of us are willing to [sign our names] that we think it is pseudoscience, knowing that some people may disagree, that’s still a good message.” He hopes that the letter reaches young researchers, policymakers, journal editors and funders. “All of them right now are very easily swayed by the media narrative.”

    Mudrik, who emphasizes that she deeply respects the people who signed the letter, some of whom are close collaborators and friends, says that she worries about the effect it will have on the way the consciousness field is perceived. “Consciousness research has been struggling with scepticism from its inception, trying to establish itself as a legitimate scientific field,” she says. “In my opinion, the way to fight such scepticism is by conducting excellent and rigorous research”, rather than by publicly calling out certain people and ideas.

    Hoel fears that the letter might discourage the development of other ambitious theories. “The most important thing for me is that we don’t make our hypotheses small and banal in order to avoid being tarred with the pseudoscience label.”

    Don’t expect a ‘theory of everything’ to explain everything (Folha de S.Paulo)

    www1.folha.uol.com.br

    Dennis Overbye

    September 14, 2023

    Not even the most advanced physics can reveal everything we want to know about the history and future of the cosmos, or about ourselves


    What good are the laws of physics if we cannot solve the equations that describe them?

    That was the question that occurred to me while reading an article in The Guardian by Andrew Pontzen, a cosmologist at University College London who spends his days running computer simulations of black holes, stars, galaxies, and the birth and growth of the universe. His point was that he, and all the rest of us, are doomed to fail.

    “Even if we imagine that humanity will eventually discover a ‘theory of everything’ that encompasses every individual particle and force, the explanatory value of that theory for the universe as a whole will probably be marginal,” Pontzen wrote.

    No matter how well we think we know the basic laws of physics and the ever-growing list of elementary particles, there is not enough computing power in the universe to keep track of them all. And we can never know enough to predict with confidence what happens when all those particles collide or otherwise interact. An extra decimal place added to an estimate of a particle’s position or velocity, say, can reverberate through history and change the outcome billions of years later, through the so-called “butterfly effect” of chaos theory.
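
    The butterfly effect invoked here is easy to demonstrate outside cosmology. The sketch below is a generic illustration, not anything taken from Pontzen’s work: it uses the logistic map, a textbook chaotic system, to show how two trajectories that differ only in the tenth decimal place of their starting point become completely uncorrelated within a few dozen steps.

        # Sensitive dependence on initial conditions in the chaotic logistic map.
        def logistic(x: float, r: float = 4.0) -> float:
            return r * x * (1.0 - x)

        x_a, x_b = 0.2, 0.2 + 1e-10  # a difference far below any real measurement
        for step in range(1, 51):
            x_a, x_b = logistic(x_a), logistic(x_b)
            if step % 10 == 0:
                print(f"step {step:2d}: {x_a:.6f} vs {x_b:.6f} (diff {abs(x_a - x_b):.2e})")

    By around step 40 the two runs bear no resemblance to each other, which is the same fate awaiting any long-enough simulation of colliding particles or drifting orbits.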

    Consider something as simple as, say, the Earth’s orbit around the Sun, Pontzen says. Left to its own devices, our world, or its crispy fossil, would stay in the same orbit forever. But over the vastness of cosmic time, the gravitational nudges of the solar system’s other planets can alter its course. Depending on how precisely we characterize those nudges and the material being nudged, gravitational calculations can yield wildly divergent predictions about where the Earth and its siblings will be hundreds of millions of years from now.

    As a result, in practice, we can predict neither the future nor the past. Cosmologists like Pontzen can hedge their bets by zooming out and considering the big picture: large agglomerations of material, such as gas clouds, or systems whose collective behavior is predictable and does not depend on individual variations. We can boil pasta without tracking every water molecule.

    But there is a risk in assuming too much order. Take an anthill, Pontzen suggests. The movements of any one ant look random. But if you look at the whole, the colony seems to buzz with purpose and organization. It is tempting to see a collective consciousness at work, Pontzen writes, but “they are just lone ants” following simple rules. “The sophistication emerges from the sheer number of individuals following those rules,” he notes, citing the Princeton physicist Philip W. Anderson: “More is different.”

    In cosmology, a plausible account of the history of the universe has taken shape through simple assumptions about things we know nothing about (dark matter and dark energy) that nevertheless make up 95 percent of the universe. This “dark side” of the universe supposedly interacts with the 5 percent of known matter, atoms, only through gravity. After the Big Bang, the story goes, pools of dark matter formed and pulled in atomic matter, which condensed into clouds, which heated up and turned into stars and galaxies. As the universe expanded, the dark energy permeating it expanded too and began pushing the galaxies apart ever faster.

    But that narrative breaks down right at the start, in the first few hundred million years, when stars, galaxies, and black holes were forming in a messy, poorly understood process that researchers call “gastrophysics.”

    Its mechanics are staggeringly hard to predict, involving magnetic fields, the nature and composition of the first stars, and other unknown effects. “Certainly no one can do it now, simply starting from the reliable laws of physics, no matter how much computing power is on offer,” Pontzen said by email.

    Recent data from the James Webb Space Telescope, revealing galaxies and black holes that seem too massive and too early in the universe to be explained by cosmology’s “standard model,” appear to compound the problem. Is that enough to send cosmologists back to their drawing boards?

    Pontzen is not convinced the time has come for cosmologists to abandon their hard-won model of the universe. Cosmic history is too complex to simulate in detail. Our Sun alone, he points out, contains 10^57 atoms, and there are trillions upon trillions of such stars out there.

    Half a century ago, astronomers discovered that the universe, with its stars and galaxies, was awash in microwave radiation left over from the Big Bang. Mapping that radiation allowed them to build a picture of the infant cosmos as it existed just 380,000 years after the beginning of time.

    In principle, the whole of history could be embedded there, in the subtle curls of that primordial energy. In practice, it is impossible to read the unfolding of time in those microwaves well enough to discern the rise and fall of the dinosaurs, the dawn of the atomic age, or the appearance of a question mark in the sky billions of years later. Nearly 14 billion years of quantum uncertainty, accidents, and cosmic debris stand between then and now.

    At last count, physicists had identified some 17 kinds of elementary particles that make up the physical universe and at least four ways they interact: through gravity, electromagnetism, and the so-called strong and weak nuclear forces.

    The cosmic wager Western science has taken on is to show that these four forces, and perhaps others yet to be discovered, acting on a vast array of atoms and their constituents, are enough to explain the stars, rainbows, flowers, ourselves and, indeed, the existence of the universe as a whole. It is an enormous intellectual and philosophical mountain to climb.

    In fact, for all our faith in materialism, Pontzen says, we may never know whether we have succeeded. “Our origins are written in the sky,” he said, “and we are only just learning to read them.”

    Translation by Luiz Roberto M. Gonçalves

    The Mystery Genes That Are Keeping You Alive (Wired)

    Nobody knows what around a fifth of your genes actually do. It’s hoped they could hold the secret to fixing developmental disorders, cancer, neurodegeneration, and more.



    Roger Highfield – Aug 8, 2023 2:00 PM

    One could be forgiven for a little genetic déjà vu.

    Launched in 1990, the Human Genome Project unveiled its first readout of the human DNA sequence with great fanfare in 2000. The human genome was declared essentially complete in 2003—but it took nearly 20 more years before the final, complete version was released.

    This did not mark the end of humankind’s genetic puzzle, however. A new study has mapped the yawning gap between reading our genes and understanding them. Vast parts of the genome—areas the study authors have nicknamed the “Unknome”—are made of genes whose function we still don’t know.

    This has important implications for medicine: Genes are the instructions for making the protein building blocks of the body. Plenty of those still shrouded in darkness could have profound medical significance and may hold the keys to disorders of development, cancer, neurodegeneration, and more.

    The study makes it embarrassingly clear just how many important genes we know little to nothing about. It estimates that a fifth of human genes with a vital function are still essentially a mystery. The good news is that the research also outlines how scientists can focus on those mystery genes. “We might now be at the beginning of the end of the Unknome,” says Matthew Freeman of the Dunn School of Pathology at the University of Oxford, a coauthor of the study.

    The research team used two tools to find the gaps in our knowledge. First, using the plethora of existing databases of genetic information, they compared the genetic codes of many different species to reveal genes that look roughly similar.

    These riffs on a genetic theme are known as conserved genes, and even if we don’t understand what they do, we know that they must be important because nature is parsimonious and tends to use the same genetic machinery to do important jobs in different organisms. “The one thing we could be confident of is that, if important, these genes would be quite well-conserved across evolution,” says Freeman.

    Once they had found similar genetic riffs in worms, humans, flies, bacteria, and other organisms, the researchers could look at what was known about the function of these clearly important genes and score them accordingly, with a high “knownness” score reflecting solid understanding.

    Because so much genetic information is already available on hundreds of genomes and recorded in a standardized way, it was possible to automate this scoring process. “We then asked how many of those [conserved genes] have a score of less than one, where essentially nothing is known about them,” says Freeman. “To our surprise, two decades after the first human genome, it is still an extraordinary number.”

    In all, the total number of human genes with a knownness score of 1 or less is currently 1,723 out of 19,664.
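
    To see how a query against such a scored catalogue works, here is a minimal sketch in which the gene entries and the scores are illustrative assumptions, not values taken from the Unknome database itself: each conserved gene carries a “knownness” score, and the analysis simply counts the genes at or below a cutoff.

        # Toy "knownness" query over a hand-made gene list (scores are invented).
        from dataclasses import dataclass

        @dataclass
        class Gene:
            symbol: str
            knownness: float  # aggregated annotation score; near zero = essentially unstudied

        genes = [
            Gene("TP53", 25.0),     # a famous, heavily studied gene (score invented)
            Gene("C1orf112", 0.5),  # a poorly characterised open reading frame
            Gene("FAM241A", 0.0),
        ]

        threshold = 1.0
        unknown = [g for g in genes if g.knownness <= threshold]
        print(f"{len(unknown)} of {len(genes)} genes score {threshold} or less")

    Scaled up to the full catalogue of conserved human genes, the same kind of count is what yields the 1,723-out-of-19,664 figure above.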

    By the same token, the top 10 genes identified by the team’s rummage through genetic databases corresponded with “all the most famous genes, which is reassuring,” says Sean Munro of the Laboratory of Molecular Biology in Cambridge, a study coauthor. “We recognized every single one of them, and there are already thousands of papers about each of them.”

    When it came to the substantial number that were unknown, the team conducted one more study, using the best understood (at the genetic level) organism of all: Drosophila melanogaster. These fruit flies have been the subject of research for more than a century because they are easy and inexpensive to breed, have a short life cycle, produce lots of young, and can be genetically modified in numerous ways.

    The team used gene editing to dial down the use of around 300 low-scoring genes found in both humans and fruit flies. “We found that one-quarter of these unknown genes were lethal—when knocked out, they caused the flies to die, and yet nobody had ever known anything about them,” says Freeman. “Another 25 percent of them caused changes in the flies—phenotypes—that we could detect in many ways.” These genes were linked with fertility, development, locomotion, protein quality control, and resilience to stress. “That so many fundamental genes are not understood was eye-opening,” Freeman says. It’s possible that variation in these genes could have very big impacts on human health.

    All of this “unknomics” information is held on a database, which the team is making available for other researchers to use to discover new biology. The next step may be to hand the data on these mystery genes and the mystery proteins they create over to AI.

    DeepMind’s AlphaFold, for example, can provide important insights into what mystery proteins do, notably by revealing how they interact with other proteins, says Alex Bateman of the European Bioinformatics Institute, based near Cambridge, UK. So can cryo-EM, which is a way of producing images of large, complex molecules, he says. And a University College London team has shown a systematic way to use machine learning to figure out what proteins do in yeast.

    The Unknome is unusual in that it’s a biology database that will shrink as we understand it better. The paper shows that over the past decade “we have moved from 40 percent to 20 percent of the human proteome having a certain level of unknownness,” says Bateman. However, at current progress rates, working out the function of all human protein-coding genes could take more than half a century, Freeman estimates.

    The discovery that so many genes remain misunderstood reflects what is called the streetlight effect, or the drunkard’s search principle, an observational bias that occurs when people only search for something where it is easiest to look. In this case, it has caused what Freeman and Munro call a “bias in biological research toward the previously studied.”

    The same goes for researchers, who tend to get funding for research in relatively well-understood areas, rather than going off into what Freeman calls the wilderness. This is why the database is so important, Munro explains—it fights back against the economics of academia, which avoids things that are very poorly understood. “There is a need for a different type of support to address these unknowns,” says Munro.

    But even with the database becoming available and researchers picking through it, there will still be some knowledge blind spots. The study focused on genes that are responsible for proteins. Over the past two decades, uncharted areas of the genome have also been found to harbor the code for small RNAs—scraps of genetic material that can affect other genes, and which are critical regulators of normal development and bodily functions. There may be more “unknown unknowns” lurking in the human genome.

    For now, there’s still plenty to get into, and Freeman hopes this work will encourage others to study the genetic Terra Incognita: “There’s more than enough Unknome for anyone who wants to explore genuinely new biology.”

    Amid Indian Nationalism, Pseudoscience Seeps Into Academia (Undark)

    Scientists and students participate in the 2019 March for Science at Rajabazar Science College, Kolkata, West Bengal, India. Visual: Avishek Das/SOPA Images/LightRocket via Getty Images


    In recent years, falsehoods have spread to institutions, where the next generation of scientists are being educated.

    By Arbab Ali & Nadeem Sarwar

    07.26.2023

    In Oct. 2022, India’s Ministry of Science and Technology, in collaboration with other ministries and departments, announced that it would host a four-day conference called “Akash For Life” at a university in the northern Indian city of Dehradun.

    “Akash” translates to “sky” or “spirit” in Hindi, and refers to one of five universal elements according to Hinduism. The event, according to its organizers, would integrate such traditional concepts into an academic sphere, and seek to “educate the youth of India to the wisdom of ancient science along with modern scientific advancements.”

    But no sooner was the event announced than it stirred furor in the Indian scientific community.


    In a statement issued later that month, the Karnataka chapter of the nonprofit India March for Science wrote, “We reject the concept of Panchabhootas” — referring to the Hindu concept of the five elements. “The sky, earth, water are not elements. Such concepts have been deleted from science books a long time back.”

    The West Bengal chapter was similarly clear in its disapproval: “Any attempt to belittle or trivialize humanity’s quest for knowledge through the scientific method has to be debunked and thoroughly rejected.”

    The Ministry of Science and Technology did not respond to multiple requests for comment from Undark.

    The “Akash” conference was just one of the latest events in India to face charges of pseudoscience as academics grow concerned about the country’s rise of conspiracies and falsehoods. Journalist Ruchi Kumar reported on this phenomenon for Undark in 2018, but experts say such discourse has only picked up in pace — and increasingly spread to institutions, where the next generation of scientists are being educated.

    Aniket Sule, an associate professor at the Tata Institute of Fundamental Research in Mumbai, noted that while fringe voices can be few and far between, they are still given prominence at conferences and meetings, which paints a wrong picture for the entire faculty.

    “Now, what has happened is that these fringe right-wing sympathizers have been given prominence,” said Sule. “Even if, for example, out of a hundred people, if there is one right-wing sympathizer, then that one person would be called to all events.”


    Many experts have tied the rise of pseudoscience in India to the Bharatiya Janata Party, a right-wing political party that came to power in 2014, when Prime Minister Narendra Modi was elected. Members of the party have repeatedly amplified scientific falsehoods — for instance, that cow urine can cure cancer, or that ancient Indians invented the internet.

    “It is clear that the government is propagating this sort of pseudoscience,” said Soumitro Banerjee, an engineering professor at the Indian Institute of Science Education and Research in Kolkata.

    Such claims often tout the superiority of traditional knowledge over modern science and cite ancient Hindu texts as evidence. In recent years, they have leapt into academic circles.

    A screenshot of the audience in attendance at “Akash For Life” in the fall of 2022. Visual: Uttaranchal University/YouTube

    In 2019, for example, G. Nageswara Rao, then vice chancellor of Andhra University, said that the Kauravas — who appear in the Hindu epic Mahabharata — were born of “stem cell and test tube technology.”

    More recently, news came out that Laxmidhar Behera, director of the Indian Institute of Technology Mandi, once claimed to have performed an exorcism with holy chants. When asked about the experience, Behera later told the newspaper The Indian Express, “Ghosts exist, yes.”

    Scientific falsehoods have not only been espoused by academics, but have also made their way into course teachings.

    In 2020, the Indian Institute of Technology Indore introduced a class to impart mathematical and scientific knowledge from ancient texts in the Sanskrit language. And in February of this year, IIT Kanpur — one of the country’s most elite universities — invited Rajiv Malhotra for a guest lecture. In the past, Malhotra cited a satirical article to deny the existence of the Greek civilization and touted the spiritual concept of the “third eye” as a substitute for medical diagnosis.

    The same month, a group of scientists and researchers criticized the National Commission for Indian System of Medicine — the regulatory body governing public medical institutions’ policies — for introducing medical astrology as an elective in the Bachelor of Ayurvedic Medicine and Surgery program, which is offered at hundreds of institutions in India. The course material offers remedies in the form of mantras, amulets with protective powers, rituals, and counseling based on astrological calculations.


    Ayurveda is a traditional system of Indian medicine that takes a natural approach to healing. Practitioners believe that diseases happen due to an imbalance in a person’s consciousness, and therefore, rely on a healing system that involves herbs, exercises, and meditation.

    But Ayurveda is a topic of contention, and its claims can be at odds with modern medical science. Cyriac Abby Philips, an Indian liver doctor based in Kerala who regularly debunks pseudoscientific claims on social media, said the alternative Ayurvedic medical system is based on pseudoscientific principles.

    Ayurveda has no basis in science, “but the whole aspect is that it has deep links to culture, tradition, and religion in India,” Philips told Undark. Yet, he said, the government is promoting Ayurvedic practices. A few years ago, for example, the National Health Mission, a government program that aims to improve access to health care, introduced a bridge course — designed to help students transition from one academic level to another — to allow Ayurveda doctors to prescribe treatments based on western medical sciences despite never studying it as part of their degree course. The move, according to the government, was to address the lack of doctors in rural areas, but the president of the Indian Medical Association has said there is no shortage. While the bridge course was ultimately dropped, some states have allowed Ayurveda doctors to prescribe and dispense medicines.

    The National Health Mission did not respond to multiple requests for comment from Undark.

    Meanwhile, the University Grants Commission, the statutory body responsible for maintaining the country’s higher education standards, asked all universities in India to “encourage” their students to take the Kamdhenu Gau Vigyan Prachar-Prasar Examination, a national-level test on “gau vigyan” or “cow science” — referring to research on the animal, which is considered sacred in Hinduism. The syllabus for the exam made claims including that earthquakes happen due to cow slaughter, and that cow byproducts are capable of curing a whole host of diseases.

    The University Grants Commission did not respond to multiple requests for comment from Undark.


    In India, higher education institutions are intricately tied to the national government. “Save for a few exceptions, almost every single academic institution is reliant heavily on government funding,” said Mohammad Nadeem, an assistant professor in the Department of Computer Science at Aligarh Muslim University.

    Nadeem said that, while he believes it’s important to take pride in Indian culture and heritage, glorifying its past with false claims does not serve anyone.

    Natesan Yogesh, an assistant professor of physics at the National Institute of Technology Calicut, noted that many professors at these prestigious universities believe in superstitions, but “it is not just a single faculty is approving and they come up with certain ideas. From the top itself, they are asking for proposals.”


    In April, the exclusion of Charles Darwin’s theory of biological evolution from high school textbooks became national news in India. More than 1,800 scientists, educators, and community members signed a letter condemning the move, calling it a “travesty of education.”

    But while some students and academics have been vocal in speaking out against the rise of pseudoscience and Hindu nationalism, experts noted that many are quiet, whether it be out of fear of retaliation — including denying funding and promotional opportunities — or simple opportunism.

    According to Banerjee, higher-ups at Indian scientific institutes have tried to stymie anti-pseudoscience protests since they are nearing retirement. “These people have aspirations or ambitions of being vice chancellors somewhere,” Banerjee added.


    In an email to Undark, G.L. Krishna, an Ayurvedic physician and a visiting scholar at the Indian Institute of Science in Bengaluru, wrote that dissenting voices are often “unnecessarily scared.” But according to Sule, the professor at the Tata Institute of Fundamental Research, even though those who actually believe in pseudoscience are a minority, such public statements can impact careers.

    In universities and institutions “where promotions are in the hands of top authorities, there this political favoritism is happening a lot,” said Sule. He, along with other faculty members interviewed by Undark, said that political affiliations dictate progress in academic careers, so people often choose to stay silent.

    Indeed, many heads of educational institutions in India have been vocal supporters of or involved in the national government. Santishree Dhulipudi Pandit, for example, was named the vice chancellor of Jawaharlal Nehru University early last year, and has voiced support for the ruling BJP party as well as called for “China-style” persecution of left-leaning voices. Rupinder Tewari, a previous candidate for the vice-chancellor post in Panjab University, alleged that only BJP-affiliated candidates were called in for the interview.

    Panjab University did not respond to multiple requests for comment from Undark.

      Some academics wonder what effect the pseudoscientific trend might have on India’s reputation among the international scientific community. “But in the long run, it’s these pseudoscience peddlers who are being watched and earning the ire of the international academia and science diaspora,” said Sule.

      Still, dissenting voices such as Banerjee and Krishna are hopeful that more people will speak out, and that scientific methods will take precedence in Indian academic spheres.

      “Reality-based thinking as opposed to belief-based thinking must carry weight,” wrote Krishna. “That’s the only way.”


      Arbab Ali and Nadeem Sarwar are independent reporters based in Delhi, India.