Tag archive: Meteorology

Blamed for the bad weather, Cacique Cobra Coral Foundation says it did not fail (O Globo)

Representatives of the organization insist there were no failures in its operation

Despite the Cacique Cobra Coral Foundation’s operation, the Rio Games have been hit by bad weather, with events having to be postponed. Photo: Jorge William / Agência O Globo

BY LUIZ ERNESTO MAGALHAES

10/08/2016 15:53 / updated 10/08/2016 16:17

RIO – Regattas on the Lagoa postponed, tennis sessions rescheduled, disruption caused by heavy surf swamping venues on Copacabana Beach… Accredited by the Rio 2016 Organizing Committee to monitor weather conditions during the Olympics, the members of the Cacique Cobra Coral Foundation, who claim to have supernatural power to control the weather, say there were no failures in the spiritual operation to guarantee the success of the Games.

The medium Adelaide Scritori, who claims to channel the spirit of Cacique Cobra Coral, has already circulated several times through the Olympic Park. The foundation’s spokesman, Osmar Santos, insists the entity’s performance so far is worthy of a gold medal. According to him, the priorities were to steer the weather so that the opening ceremony stayed dry and the winds blew in a way that kept the Guanabara Bay racing lanes free of garbage.

According to Osmar, on Sunday, when strong winds hit the city, causing damage and postponing rowing events, Adelaide was not even in Rio. The medium, he says, was in the Região Serrana, wrapping up the opening-ceremony operation. The medium’s spokesman argues that the entity’s spiritual demands are countless and are not limited to the Olympics.

– Our great legacy from the opening ceremony was diverting the cold front that was over Rio to Minas Gerais, where it dumped 30 millimeters of rain in the middle of August in the Vale do Jequitinhonha. For the cacique, that is far more important. Now we are going to open a corridor for the fronts to move inland and put out the wildfires in the Pantanal – said Osmar.

According to Osmar, today’s bad weather is related to the delayed arrival of the cold front over the city, timed to ensure the cleanup of the Bay.

This is not the first time the Foundation has worked at an Olympics. GLOBO reporters found members of the Foundation in London in 2012, accredited even for a visit by suspended president Dilma Rousseff during an official event of the Brazilian Olympic Committee. Adelaide was also in Copenhagen (Denmark) in 2009, when Rio was chosen as host city of the 2016 Olympics.

The Foundation is also called in for other events, such as New Year’s Eve and Rock in Rio. In recent editions of Rock in Rio, however, it rained heavily on some days. On both occasions the Cacique Cobra Coral Foundation claimed it was held up before reaching the Cidade do Rock because of problems with the accreditation of the car carrying its members.


Cemaden marks five years monitoring 957 municipalities across Brazil (MCTI)

JC, 5449, July 1, 2016

More than R$ 72 million has already been invested in high-resolution weather radars for monitoring risk areas. Cemaden is also developing technology to forecast crop failure in the semiarid region

The National Center for Monitoring and Early Warning of Natural Disasters (Cemaden) turns five years old monitoring risk areas in 957 municipalities across the country, a job done 24 hours a day, without interruption. Information supplied by high-technology weather radars has already enabled the issuing of 5,500 alerts to the Civil Defense. According to Cemaden’s director, Osvaldo Moraes, 12 new units will be acquired, which will increase the reliability of the alerts.

“The radars will be installed in areas that are currently not covered, which will give Brazil one of the world’s most advanced systems for monitoring the risk of natural disasters,” says Moraes.

Attached to the Ministry of Science, Technology, Innovation and Communications, Cemaden has invested R$ 72 million in nine weather radars, state-of-the-art equipment for natural-disaster monitoring, installed in Petrolina (PE), Natal (RN), Maceió (AL), Salvador (BA), Almenara (MG), Três Marias (MG), São Francisco (MG), Santa Teresa (ES) and Jaraguari (MS). With them, Cemaden can measure rainfall, produce very-short-term weather forecasts and issue early alerts for municipalities at risk of natural disasters. The data are publicly available in real time.

“These radars can identify and locate the clouds within the equipment’s coverage area, as well as measure the amount of rain at a given location. Their use allows Cemaden’s operators to identify the places where weather conditions raise the risk of natural disasters. With that, we can issue early alerts about the risk of landslides and floods, for example, protecting the lives of the people exposed to the risk,” explains the coordinator of Cemaden’s Weather Radars project, Carlos Frederico de Angelis.

He points out that the unit installed in Maceió, for example, has the technology and sensitivity to gather detailed rainfall data within a radius of 400 kilometers. According to Angelis, the equipment can anticipate the risk of disasters caused by meteorological phenomena up to six hours in advance.
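
The article does not describe Cemaden’s processing chain, but a standard way weather radars turn reflectivity into the rainfall amounts mentioned above is an empirical Z-R power law such as the classic Marshall-Palmer relation (Z = 200·R^1.6). A minimal sketch with the usual textbook constants, not anything Cemaden-specific:

```python
def rain_rate_from_dbz(dbz: float, a: float = 200.0, b: float = 1.6) -> float:
    """Estimate rain rate R in mm/h from radar reflectivity in dBZ,
    inverting the empirical power law Z = a * R**b (Marshall-Palmer)."""
    z = 10.0 ** (dbz / 10.0)       # dBZ -> linear reflectivity Z (mm^6/m^3)
    return (z / a) ** (1.0 / b)    # solve Z = a * R**b for R

# Example: a 45 dBZ echo, typical of a strong convective cell
print(f"{rain_rate_from_dbz(45.0):.1f} mm/h")  # ~23.7 mm/h
```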

“A natural disaster affects not only the lives of the people in risk areas but also infrastructure and the economy, such as agribusiness, for example. The impacts are even greater for small producers,” says Angelis.

Citizen Science

Crop failure is also on Cemaden’s radar: in the second half of the year the center will launch the Crop-Collapse Risk Forecasting System for the Brazilian Semiarid (Sistema de Previsão de Riscos de Colapso de Safras no Semiárido Brasileiro), with collapse-risk forecasts generated from agrometeorological model estimates; public data on typical regional crops (corn, cassava, beans, rice and sorghum); and weather forecasts for a window of 15 to 45 days.

“The implementation of this project involves using agrometeorological models integrated with a data monitoring network (meteorological and phenological data, management practices and soil information), contributing to the generation of indicators for drought monitoring, for forecasting and managing crop-collapse risks, and for improving warning systems,” explains Cemaden’s director.
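
The article names the system’s inputs but not its algorithm. Purely as an illustration of how such inputs might be combined, here is a hypothetical water-deficit risk flag; the function name, thresholds and numbers are invented for this sketch, not Cemaden’s:

```python
def collapse_risk(forecast_rain_mm: float, crop_water_need_mm: float) -> str:
    """Classify crop-collapse risk from the forecast water deficit
    over the forecast window (hypothetical thresholds)."""
    deficit = max(crop_water_need_mm - forecast_rain_mm, 0.0)
    ratio = deficit / crop_water_need_mm
    if ratio > 0.5:
        return "high"
    if ratio > 0.2:
        return "moderate"
    return "low"

# Example: a crop needing ~120 mm over the window, with only 45 mm forecast
print(collapse_risk(forecast_rain_mm=45.0, crop_water_need_mm=120.0))  # high
```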

In addition, the system will use information sent by the producers themselves through an app developed in partnership with the National Institute of Science and Technology for Climate Change (INCT) of the MCTIC and the International Institute for Applied Systems Analysis (Iiasa), in Austria.

Available for mobile phones since the beginning of the year, AgriSupport allows georeferenced photographic records of planted areas and the collection of information on plantings by small producers in the semiarid region.

In the future, crop monitoring will be extended to other crops in other regions of the country.

 MCTIC

Clouds mark the boundaries of ecosystems (El País)

Cloud-cover patterns trace the map of bioclimatic landscapes and the distribution of species

MIGUEL ÁNGEL CRIADO


The geographer Adam Wilson and the ecologist Walter Jetz looked at clouds to learn about the life beneath them. The two scientists used satellite images taken twice a day over the past 15 years to build an atlas of clouds, and related that map to the planet’s biodiversity, tracing everything from the boundaries of the major biomes (bioclimatic landscapes) to the geographic distribution of individual species.

Suspended up above, clouds are a fundamental element of climatology. Their presence signals humidity, rain, water for plants, woods and forests, an explosion of life… Their absence, on the other hand, characterizes drier, more desolate landscapes, whether in deserts or in the interior of Antarctica. It was this connection between climate and biodiversity that led Wilson, a professor at the University at Buffalo, and Jetz, a researcher at Yale (both in the US), to look for a way of detecting global cloud patterns and dynamics more efficient than current systems.

They found the solution in the photographs of Earth that NASA has been taking for years. Specifically, they used the data accumulated by the MODIS mission, the acronym for Moderate Resolution Imaging Spectroradiometer, a scientific instrument carried aboard two satellites called Terra and Aqua. The first was placed in orbit in 1999, the second in 2002. The two circle the planet in a pole-to-pole orbit, taking synchronized photographs so that Terra passes over the equator in the morning and Aqua does so in the afternoon, in the opposite direction. Every two days they photograph the entire planet in high resolution.

The equatorial regions have the highest annual concentration of clouds and the lowest monthly variation

With this global reach and a resolution down to less than one kilometer, the two researchers built their cloud atlas. In its online version you can view the annual cloud frequency, defined as the percentage of days that are more cloudy than clear, at each latitude. Monthly, seasonal and annual variation can also be viewed.
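
The atlas’s core metric is simple enough to state in code. A minimal sketch of the computation as defined above, using a random placeholder array in place of the real MODIS cloud flags; the authors’ actual pipeline is more involved and is not described here:

```python
import numpy as np

# Placeholder for one year of daily cloud masks, shape (days, lat, lon);
# True where a pixel was flagged cloudy. Real input would be MODIS flags.
daily_cloudy = np.random.rand(365, 180, 360) > 0.5

# Annual cloud frequency per pixel: the percentage of cloudy days.
cloud_freq = daily_cloudy.mean(axis=0) * 100.0

# Month-to-month variability: spread of the twelve monthly frequencies,
# large in monsoon regions and small over the equatorial forests.
monthly = np.stack([m.mean(axis=0) * 100.0
                    for m in np.array_split(daily_cloudy, 12, axis=0)])
seasonality = monthly.max(axis=0) - monthly.min(axis=0)
```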

At first glance (see photograph), a correlation between latitude and cloud patterns can be seen. Equatorial America, the Congo river basin in Africa and Southeast Asia are the cloudiest regions on the planet: up to 80% of days are cloudy. Even though the species inhabiting these great biomes may differ, these are ecosystems that share many characteristics.

The map also shows the variation over the year. While the equatorial rainforests show small variations that never exceed 5% from one month to the next, the monsoon biomes of India and the African Sahel suffer the greatest differences between cloudy and clear months, corresponding to the rainy season and the dry season.

“When we visualized the data, what stood out was how clearly we could see Earth’s many different biomes based on the frequency and timing of cloudy days over the past 15 years,” says Wilson. “As you move from one ecosystem to another, those transitions show up very clearly, and the best part is that these data let us observe those patterns directly at a one-kilometer resolution,” he adds.

The map shows the distribution of clouds since 1999. The areas with the greatest annual cloud cover are in black. The different colors and their intensity show the monthly variations. Adam Wilson

This resolution is one of the study’s greatest contributions. It may be obvious that the Congo basin has many cloudy days, but with satellite images you can see local differences, between the north and south banks of a river or the east and west slopes of a mountain, for example. That level of detail was achievable in the most developed parts of the planet, but not in the least developed ones, which are precisely those with the greatest biological richness.

Until now, biodiversity studies were based on direct observation by researchers (and therefore very partial) and on extrapolations from other data-collection systems. One of the largest is the network of weather stations, whose data on humidity, wind and precipitation sketch the climatic landscape in which the different species live. But the station network is not dense enough either, so scientists have to interpolate from data that are sometimes very local and scattered.

The cloud atlas captured the geographic distribution of the king protea (its flower in the image), a shrub of South Africa’s Mediterranean-climate belt. Adam Wilson

“Understanding the spatial patterns of biodiversity is fundamental if we want to make informed decisions about how to protect species and manage biodiversity and its many services for the future,” says Jetz. But he adds: “for the regions with the most biological diversity, there is a real shortage of local data.”

This novel study, published in PLoS Biology, also shows the intimate and fragile relationship between clouds and the so-called cloud forests. Those forests, with their constant or at least regular cover of low cloud and fog, do not escape the satellites’ detection either. Such regions are rich in endemic species, so the alteration of cloud patterns by human action and climate change could have catastrophic consequences.

The researchers, who do not intend to replace existing models but to add another layer of knowledge, wanted to test their cloud atlas’s ability to indicate not only the limits of a given ecosystem but also the geographic distribution of two species. One is the montane woodcreeper, a small bird of the mountain forests of northern South America. The other is the king protea, a shrub of South Africa’s Mediterranean-climate region. In both cases, what they read in the clouds was more accurate than the models based on precipitation and temperature records.

More respect for Funceme (O Povo)

OP-ED 29/02/2016

Fátima Sudário

This past week, Funceme updated its forecast for the rainy season, which runs through May in the region that includes Ceará. On a day of heavy rain in the capital, it reaffirmed a roughly 70% probability of below-average rainfall.

That means severe drought. It is reason to demand action from public authorities and to commit to social mobilization for an unfavorable scenario.
For the first time, the volume of the Castanhão, the main supplier of water to the Fortaleza Metropolitan Region, fell below 10%.

But the reaction, by and large, has been limited to skepticism about Funceme’s forecasts. There is no shortage of derogatory comments, jokes and irony, a kind of ingrained culture whenever the subject is the institution, which, besides meteorology, works on the environment and water resources.

I suppose this attitude can be put down to forecasting imprecision, which does happen, to the political use of information, as happened in the past, or simply to ignorance. But it bothers me. Meteorology deals with complex global parameters, such as air and ocean temperatures, wind speed and direction, humidity, atmospheric pressure, phenomena like El Niño… It has advanced considerably in the reliability of meteorologists’ forecasts, using data from satellites, weather balloons and a good deal more technological apparatus to feed complicated mathematical models that sketch probabilities, not certainties.

Mistakes are made here, as everywhere else in the world. But the information generated has profound social, economic, scientific and cultural impact and is essential to decision-making, public and private. It is something no manager or community can do without, especially in a region like ours, vulnerable to climate variability and dependent on rain. We need a change of mentality regarding Funceme’s work. I mean genuine respect for what is dear and fundamentally necessary to us.

Incidentally, it is a long shot, but I hope nature defies the forecast and that enough rain falls to guarantee a minimum of water security, productivity and dignity for a Ceará that depends so much on the climate information generated by Funceme.

Fátima Sudário

Journalist at O POVO

Funceme meteorologist: “We are happy with these rains” (O Povo)

AGUANAMBI 282

SUN 24/01/2016

According to Funceme meteorologist Raul Fritz, the cyclonic vortex, characteristic of the pre-rainy season, can deliver intense rains in January, as happened in 2004

Luana Severo, luanasevero@opovo.com.br

PHOTOS: IANA SOARES

According to Fritz, climate science has not reached a level precise enough to offer a reliable forecast

Cotidiano

“We don’t want to be God, we just try to anticipate what may happen.” Born in Natal, in Rio Grande do Norte, Raul Fritz, 53, is supervisor of the Weather and Climate unit of the Ceará Foundation for Meteorology and Water Resources (Funceme). Fritz, who says he has no wish to take God’s place in decisions about the climate, began working for Funceme in 1988, still as an intern, shortly after a drought that lasted five years in the state, from 1979 to 1983.

Years of practice and a specialization in satellite meteorology give Fritz the credibility to estimate, using maps, numerical equations and the behavior of nature, whether or not it will rain in Ceará’s semiarid region. He was thus part of the Foundation’s team of meteorologists that, last Wednesday, the 20th, forecast a 65% chance of below-average rainfall between February and April this year, a prognosis that, if confirmed, will make this Ceará’s fifth consecutive year of drought.

In an interview with O POVO, he details the forecast, describes Ceará’s climate system and comments on the fraught relationship between Funceme and the public, which keeps up the habit of distrusting all of the agency’s forecasts, especially because, one day after the forecast was released, the state was caught off guard by an intense downpour.

O POVO – Even with the discouraging forecast of a 65% chance of below-average rainfall between February and April, people in Ceará have renewed their faith in a good rainy season because of the recent precipitation influenced by the Upper-Level Cyclonic Vortex (VCAN, in the Portuguese acronym). Could this phenomenon persist?

Raul Fritz – Yes. The system acting now is most intense in January. It can persist until mid-February, especially given the atmospheric conditions we see at the moment.

OP – Why is the VCAN unrelated to the rainy season (quadra chuvosa)?

Raul – The rainy season is characterized by the action of a system that is very important for the North and Northeast: the Intertropical Convergence Zone (ITCZ). That is the system that brings rain to the state in a more regular way. The vortex is very irregular. In some years it brings good rains; in others it brings practically none.

OP – Can you recall another time when the VCAN played an important role in the rains?

Raul – In 2004 there was a lot of rain in January. In February we also had good rains, but mainly in January, to the point of filling the Castanhão reservoir, which had just been built. But the months that followed those two were not good rain months, so it is possible that we will have this very rainy period now, followed by scarcer rains.

OP – What drives the drought conditions in Ceará?

Raul – Geographically, there are natural factors that produce a semiarid climate. It is a region with very great irregularity in the distribution of rainfall, both across the territory and over time. The rains sometimes behave well in one period of the year and poorly in the next, and they are concentrated in the first half of the year, mainly between February and May, which we call the rainy season. Then there is the pre-season, which in some years turns out well. That appears to be the case this year.

OP – The last prolonged drought in Ceará, which lasted five years, ran from 1979 to 1983. We are currently heading toward the same situation. What can break this cycle?

Raul – The cycle generally does not, or tends not to, exceed that length. Natural climate variability itself interrupts it. Few cases last that long; two to three years is more common. But sometimes they can stretch to those two examples of five consecutive years of below-average rainfall. We may also have some influence from global warming, which possibly disturbs natural conditions. Phenomena like intense El Niños contribute. When they arrive and settle in the Pacific Ocean, they tend to amplify this kind of severe drought, as is the case now. The El Niño acting at the moment is equivalent to the one of 1997 and 1998, which caused a great drought.

OP – Is this pattern of interspersed great droughts a trend?

Raul – Yes, and years with rainfall between normal and below average are more frequent than years above average.

OP – Popular wisdom, in the voice of the “rain prophets,” is betting on regular rainfall this year. At what point does it converge with scientific knowledge?

Raul – The rain prophet perceives, by analyzing nature, that living things are reacting to the weather conditions and, from that, produces a long-range forecast, which is a climate forecast. But that climate forecast may not correspond exactly to a continuation of the variation that was occurring at the moment he made his assessment. If it does, he thinks he got the climate forecast right. If not, he considers that he got it wrong. But it can happen that this short-term variation repeats itself and turns into a long-term one. That is the point where they converge. Funceme tries to anticipate what may happen over a longer horizon, three months ahead. It is a very difficult exercise.

OP – There is generally public disbelief in Funceme’s forecasts. How can that be demystified?

Raul – The forecast offers probabilities, and any of them can happen, but we indicate the most likely one. We issue three. What happens is that the public does not manage to understand this information, which follows the international standard for releasing forecasts. They think it is deterministic: if Funceme forecast below-average rainfall as the most likely outcome for a certain period, they think it has already forecast a drought. But that really is the most likely outcome, and the point is to alert people whose activities depend on the rains, and the government itself, to take precautions and prepare rather than only react to a drought already under way.
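
Fritz’s point about probabilities can be made concrete. The 65% figure is from the interview; the split of the remaining 35% below is hypothetical, just to show how a three-category forecast reads:

```python
# Tercile-style seasonal forecast: one probability per category.
forecast = {"below average": 0.65,   # the figure quoted in the interview
            "around average": 0.25,  # hypothetical split
            "above average": 0.10}   # hypothetical split

assert abs(sum(forecast.values()) - 1.0) < 1e-9  # probabilities must sum to 1

most_likely = max(forecast, key=forecast.get)
print(most_likely)                  # "below average": the headline category
print(1.0 - forecast[most_likely])  # 0.35: even the headline can fail
```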

OP – So Funceme, too, is surprised by the lower-probability occurrences, such as the VCAN?

Raul – Yes, because these vortices are hard to predict. Science has not reached a level of precision high enough to provide a reliable forecast for this period (the pre-rainy season). Even so, we are required to give some idea of what may happen. It is a very big risk that Funceme takes. We get criticized for it. For example, we released the forecast of below-average rainfall, and the next day a very intense rain arrives. People do not understand; they think the current rains will last through the rest of the season. Despite the criticism from the public, which even goes as far as calling for the agency to be shut down, we are happy when these rains arrive.

Quote

“We released the forecast of below-average rainfall, and the next day a very intense rain arrives. People do not understand; they think the current rains will last through the rest of the season”

Raul Fritz, Funceme meteorologist

Video: Raul Fritz, the rain scientist (Iana Soares/O POVO)

Doubts over El Niño delay disaster preparedness (SciDev Net)

Image credit: Patrick Brown/Panos

27/10/15

Martín De Ambrosio

At a glance

  • The phenomenon’s effects are still unclear across the continent
  • There is no certainty, but sitting on one’s hands is not an option, according to the Pan American Health Organization
  • There is 95 percent scientific consensus on the chances of a strong El Niño

Disagreements among scientists over whether or not Central and South America will suffer a strong El Niño event are causing some delay in preparations, warn the main organizations working on the region’s climate.

Some South American researchers still have doubts about how the event is unfolding this year. This uncertainty affects officials and governments, which should act as soon as possible to prevent the worst-case scenarios, including deaths from natural disasters, the meteorological organizations complain.

Eduardo Zambrano, a researcher at the International Research Center on the El Niño Phenomenon (CIIFEN) in Ecuador, one of the World Meteorological Organization’s regional centers, says the problem is that the phenomenon’s effects have not yet been clear and evident across the whole continent.

“Some satellite images show us a very warm Pacific Ocean, one of the hallmarks of El Niño.”

Willian Alva León, president of the Meteorological Society of Peru

“Even so, we can point to the extreme droughts in northeastern Brazil, Venezuela and the Caribbean area,” he says, also mentioning the unusually heavy rains in Chile’s Atacama Desert since March and the floods in parts of Argentina, Uruguay and Paraguay.

El Niño peaks when a mass of water, warm by the usual standards of the eastern Pacific Ocean, moves from north to south and reaches the coasts of Peru and Ecuador. This movement causes cascading effects and havoc across the whole Central and South American system, turning arid highlands rainy while droughts hit the lowlands and storms batter the Caribbean.

But El Niño remains hard to predict because of its very different impacts. Scientists, according to Zambrano, expected El Niño last year, “when all the alarms went off, and then nothing too extraordinary happened because of a change in wind direction.”

After that error, many organizations opted for caution to avoid alarmism. “Some satellite images show us a very warm Pacific Ocean, one of the hallmarks of El Niño,” says Willian Alva León, president of the Meteorological Society of Peru. But, he adds, this warmth is not moving southeast toward the Peruvian coast, as it would in an El Niño event.

Alva León believes the worst effects have already occurred this year, meaning the phenomenon is in retreat. “El Niño has an energy limit, and I believe it has already been reached this year,” he says.

This disagreement among climate research institutions worries policymakers, who need clear guidance to begin the necessary preparations. Ciro Ugarte, regional advisor for Emergency Preparedness and Disaster Relief at the Pan American Health Organization, says it is imperative to act as if El Niño were indeed under way, to ensure the continent can face the possible consequences.

“Being prepared is important because it reduces the impact of the phenomenon as well as of other diseases that are epidemic today,” he says.

To gauge the probability of El Niño, some scientists use models that abstract data from reality and generate predictions. María Teresa Martínez, deputy director of meteorology at Colombia’s Institute of Hydrology, Meteorology and Environmental Studies, notes that in March the most reliable models predicted a 50 to 60 percent chance of an El Niño event. “Now El Niño is developing strongly from its formation stage toward its mature stage, which will be reached in December,” she says.

Ugarte admits there are no certainties, but says that for his organization “doing nothing is not an option.”

“As makers of prevention policies, what we have to do is use the consensus among scientists, and today that consensus says there is a 95% chance of a strong or very strong El Niño event,” he says.

Esoteric foundation promises to divert rain from Rock in Rio to Espírito Santo (Gazeta – ES)

16/09/2015 – 11:35 – Updated 16/09/2015 – 12:07
Author: Wing Costa | wbertulani@redegazeta.com.br

The spirit of Cacique Cobra Coral would divert the rains that could affect the event to ease the effects of the drought on the Rio Doce

The Cacique Cobra Coral Foundation claims it can control natural phenomena. Photo: Reproduction

What is the link between the Rio Doce, which crosses Espírito Santo, the Rio Paraíba do Sul and the Rock in Rio 2015 music festival? Besides the obvious presence of the word “rio” in their names, all of them are under the influence of a spirit – channeled by the medium Adelaide Scritori – that supposedly has the power to alter natural phenomena.

The Cacique Cobra Coral Foundation, operated by Adelaide, has been hired by the organizers of Rock in Rio since the festival’s second edition (after the historic mud bath of 1985) to divert rain away from the event site.

Since the foundation only works for the common good, as the esoteric entity’s spokesman, Osmar Santos, explains, this time the rains will be shifted to Espírito Santo to “ease the effects of the drought on the Rio Doce.”

“We are also doing a job for the Rio de Janeiro state government, to raise the level of the Rio Paraíba do Sul. Since everything is a cycle, dumping water on the state would not be a solution, so it has to rain in São Paulo, for example. That directly affects Espírito Santo, which is going through a dry spell,” the entity’s representative explains.

“And then it rained this week, didn’t it?” he asked the reporter. “We are working to make this winter a wet one. We know Espírito Santo’s situation because we have done a lot of work there at the request of Senator Gerson Camata,” he says.

Through the medium, Cacique Cobra Coral is also said to have kept rain away from Olympic Games and other events around the globe. “The clouds looked ugly in London and the forecast called for rain at 8 pm. That was when the Olympic opening ceremony would start. The medium was in Dublin – where the clouds were coming from – and managed to redirect them,” said the representative.

But none of this, he explains, was done for private benefit: at the time, many wildfires were burning in Portugal and Spain, and the indigenous spirit’s power supposedly made the rains not only spare the sporting event but also put out the fires in Europe.

The spirit in Espírito Santo

Photo: Vitor Jubini – GZ. Gerson Camata attested to the powers of Cacique Cobra Coral

The entity worked in Espírito Santo at the request of former senator Gerson Camata, who recalls the episode with good humor. When asked, he let out an “ah, the cacique,” followed by laughter. It happened back in ’87 or ’88, if the senator’s memory allows for that margin of error.

How did you get to the cacique?

It was during a very severe drought in the north of the state. A senator colleague recommended them and I called.

Was it easy to make contact?

Yes, it is a woman who carries out these operations.

Did the foundation ask for anything to work in the state?

No, they didn’t charge me anything. They just sent me a weather map and asked: “Where do you want it to rain?” I pointed to Marilândia right away, and they promised that before midnight the water would be running over the bridge.

And did it rain?

Well, I talked to the mayor at the time. He called me on the day of the promise, at 10 pm, and said the sky was starting to cloud over. When 1 am came, he called me again asking me to make them stop, or everyone would drown, there was so much water.

So you attest to the cacique’s power?

As for power, I don’t know, but rain it did.

Source: Gazeta Online

Papers by INPE researchers diagnose the drought conditions in the Southeast (INPE)

JC, 5246, August 24, 2015

The text was published in the online edition of the journal Theoretical and Applied Climatology

Published in the online edition of the journal Theoretical and Applied Climatology, the paper “Precipitation diagnostics of an exceptionally dry event in São Paulo, Brazil” presents a diagnosis of the rainfall-deficit conditions observed over the southeast of São Paulo state, including its metropolitan region, during the last two summers (2013/2014 and 2014/2015).

According to Caio Coelho of the Center for Weather Forecasting and Climate Studies (CPTEC) of the National Institute for Space Research (INPE), one of the paper’s authors, the article answers a series of questions about the occurrence of extreme drought events.

The results obtained by the researchers of CPTEC/INPE’s Operations Division reveal how exceptional the rainfall deficit observed during the 2013/2014 summer was compared with other summers since 1961/62, and show that the study region has been suffering from rainfall deficits since the late 1990s. Similar drought events were observed in the past, but of smaller magnitude in terms of rainfall deficit. One factor that contributed to the substantial precipitation deficit during the 2013/2014 summer was the exceptionally early end of the rainy season.

Another CPTEC/INPE study, published in the online edition of the journal Climate Dynamics and carried out in collaboration with researchers from the University of São Paulo and the Federal University of Itajubá, finds that the drought over the Southeast during the 2014 summer was rooted in anomalous rainfall conditions in the tropical region north of Australia, which triggered a sequence of processes between the tropical and extratropical Pacific before reaching southeastern Brazil and the adjacent Atlantic Ocean.

This paper, titled “The 2014 southeast Brazil austral summer drought: regional scale mechanisms and teleconnections,” reveals the establishment of an anomalous high-pressure system over the warmed adjacent Atlantic Ocean, which forced frontal systems onto oceanic trajectories, helped sustain the ocean warming through incident solar radiation, transported moisture from the Amazon to southern Brazil, and suppressed the formation of South Atlantic Convergence Zone events, one of the main rain-producing mechanisms over southeastern Brazil.

More details about the studies at: http://www.cptec.inpe.br/noticias/noticia/127760

(INPE)

Stop burning fossil fuels now: there is no CO2 ‘technofix’, scientists warn (The Guardian)

Researchers have demonstrated that even if a geoengineering solution to CO2 emissions could be found, it wouldn’t be enough to save the oceans

“The chemical echo of this century’s CO2 pollution will reverberate for thousands of years,” said the report’s co-author, Hans Joachim Schellnhuber. Photograph: Doug Perrine/Design Pics/Corbis

German researchers have demonstrated once again that the best way to limit climate change is to stop burning fossil fuels now.

In a “thought experiment” they tried another option: the future dramatic removal of huge volumes of carbon dioxide from the atmosphere. This would, they concluded, return the atmosphere to the greenhouse gas concentrations that existed for most of human history – but it wouldn’t save the oceans.

That is, the oceans would stay warmer, and more acidic, for thousands of years, and the consequences for marine life could be catastrophic.

The research, published in Nature Climate Change today, delivers yet another demonstration that there is so far no feasible “technofix” that would allow humans to go on mining and drilling for coal, oil and gas (known as the “business as usual” scenario), and then geoengineer a solution when climate change becomes calamitous.

Sabine Mathesius (of the Helmholtz Centre for Ocean Research in Kiel and the Potsdam Institute for Climate Impact Research) and colleagues decided to model what could be done with an as-yet-unproven technology called carbon dioxide removal. One example would be to grow huge numbers of trees, burn them, trap the carbon dioxide, compress it and bury it somewhere. Nobody knows if this can be done, but Dr Mathesius and her fellow scientists didn’t worry about that.

They calculated that it might plausibly be possible to remove carbon dioxide from the atmosphere at the rate of 90 billion tons a year. This is twice what is spilled into the air from factory chimneys and motor exhausts right now.

The scientists hypothesised a world that went on burning fossil fuels at an accelerating rate – and then adopted an as-yet-unproven high technology carbon dioxide removal technique.

“Interestingly, it turns out that after ‘business as usual’ until 2150, even taking such enormous amounts of CO2 from the atmosphere wouldn’t help the deep ocean that much – after the acidified water has been transported by large-scale ocean circulation to great depths, it is out of reach for many centuries, no matter how much CO2 is removed from the atmosphere,” said a co-author, Ken Caldeira, who is normally based at the Carnegie Institution in the US.

The oceans cover 70% of the globe. By 2500, ocean surface temperatures would have increased by 5C (9F) and the chemistry of the ocean waters would have shifted towards levels of acidity that would make it difficult for fish and shellfish to flourish. Warmer waters hold less dissolved oxygen. Ocean currents, too, would probably change.

But while change happens in the atmosphere over tens of years, change in the ocean surface takes centuries, and in the deep oceans, millennia. So even if atmospheric temperatures were restored to pre-Industrial Revolution levels, the oceans would continue to experience climatic catastrophe.

“In the deep ocean, the chemical echo of this century’s CO2 pollution will reverberate for thousands of years,” said co-author Hans Joachim Schellnhuber, who directs the Potsdam Institute. “If we do not implement emissions reduction measures in line with the 2C (3.6F) target in time, we will not be able to preserve ocean life as we know it.”

Climate Seer James Hansen Issues His Direst Forecast Yet (The Daily Beast) + other sources, and repercussions

A polar bear walks in the snow near the Hudson Bay waiting for the bay to freeze, 13 November 2007, outside Churchill, Manitoba, Canada. Polar bears return to Churchill, the polar bear capital of the world, to hunt for seals on the ice pack every year at this time and remain on the ice pack feeding on seals until the spring thaw.

Paul J Richards/AFP/Getty

Mark Hertsgaard 

07.20.15, 1:00 AM ET

James Hansen’s new study explodes conventional goals of climate diplomacy and warns of 10 feet of sea level rise before 2100. The good news is, we can fix it.

James Hansen, the former NASA scientist whose congressional testimony put global warming on the world’s agenda a quarter-century ago, is now warning that humanity could confront “sea level rise of several meters” before the end of the century unless greenhouse gas emissions are slashed much faster than currently contemplated. This roughly 10 feet of sea level rise—well beyond previous estimates—would render coastal cities such as New York, London, and Shanghai uninhabitable. “Parts of [our coastal cities] would still be sticking above the water,” Hansen says, “but you couldn’t live there.”

James Hansen

Columbia University

This apocalyptic scenario illustrates why the goal of limiting temperature rise to 2 degrees Celsius is not the safe “guardrail” most politicians and media coverage imply it is, argue Hansen and 16 colleagues in a blockbuster study they are publishing this week in the peer-reviewed journal Atmospheric Chemistry and Physics. On the contrary, a 2 C future would be “highly dangerous.”

If Hansen is right—and he has been right, sooner, about the big issues in climate science longer than anyone—the implications are vast and profound.

Physically, Hansen’s findings mean that Earth’s ice is melting and its seas are rising much faster than expected. Other scientists have offered less extreme findings; the United Nations Intergovernmental Panel on Climate Change (IPCC) has projected closer to 3 feet of sea level rise by the end of the century, an amount experts say will be difficult enough to cope with. (Three feet of sea level rise would put runways of all three New York City-area airports underwater unless protective barriers were erected. The same holds for airports in the San Francisco Bay Area.)

Worldwide, approximately $3 trillion worth of infrastructure vital to civilization, such as water treatment plants, power stations, and highways, is located at or below 3 feet of sea level, according to the Stern Review, a comprehensive analysis published by the British government.

Hansen’s track record commands respect. From the time the soft-spoken Iowan told the U.S. Senate in 1988 that man-made global warming was no longer a theory but had in fact begun and threatened unparalleled disaster, he has consistently been ahead of the scientific curve.

Hansen has long suspected that computer models underestimated how sensitive Earth’s ice sheets were to rising temperatures. Indeed, the IPCC excluded ice sheet melt altogether from its calculations of sea level rise. For their study, Hansen and his colleagues combined ancient paleo-climate data with new satellite readings and an improved model of the climate system to demonstrate that ice sheets can melt at a “non-linear” rate: rather than an incremental melting as Earth’s poles inexorably warm, ice sheets might melt at exponential rates, shedding dangerous amounts of mass in a matter of decades, not millennia. In fact, current observations indicate that some ice sheets already are melting this rapidly.

“Prior to this paper I suspected that to be the case,” Hansen told The Daily Beast. “Now we have evidence to make that statement based on much more than suspicion.”

The Nature Climate Change study and Hansen’s new paper give credence to the many developing nations and climate justice advocates who have called for more ambitious action.

Politically, Hansen’s new projections amount to a huge headache for diplomats, activists, and anyone else hoping that a much-anticipated global climate summit the United Nations is convening in Paris in December will put the world on a safe path. President Barack Obama and other world leaders must now reckon with the possibility that the 2 degrees goal they affirmed at the Copenhagen summit in 2009 is actually a recipe for catastrophe. In effect, Hansen’s study explodes what has long been the goal of conventional climate diplomacy.

More troubling, honoring even the conventional 2 degrees C target has so far proven extremely challenging on political and economic grounds. Current emission trajectories put the world on track towards a staggering 4 degrees of warming before the end of the century, an amount almost certainly beyond civilization’s coping capacity. In preparation for the Paris summit, governments have begun announcing commitments to reduce emissions, but to date these commitments are falling well short of satisfying the 2 degrees goal. Now, factor in the possibility that even 2 degrees is too much and many negotiators may be tempted to throw up their hands in despair.

They shouldn’t. New climate science brings good news as well as bad. Humanity can limit temperature rise to 1.5 degrees C if it so chooses, according to a little-noticed study by experts at the Potsdam Institute for Climate Impact Research (now perhaps the world’s foremost climate research center) and the International Institute for Applied Systems Analysis, published in Nature Climate Change in May.

“Actions for returning global warming to below 1.5 degrees Celsius by 2100 are in many ways similar to those limiting warming to below 2 degrees Celsius,” said Joeri Rogelj, a lead author of the study. “However … emission reductions need to scale up swiftly in the next decades.” And there’s a significant catch: Even this relatively optimistic study concludes that it’s too late to prevent global temperature rising by 2 degrees C. But this overshoot of the 2 C target can be made temporary, the study argues; the total increase can be brought back down to 1.5 C later in the century.

Besides the faster emissions reductions Rogelj referenced, two additional tools are essential, the study outlines. Energy efficiency—shifting to less wasteful lighting, appliances, vehicles, building materials and the like—is already the cheapest, fastest way to reduce emissions. Improved efficiency has made great progress in recent years but will have to accelerate, especially in emerging economies such as China and India.

Also necessary will be breakthroughs in so-called “carbon negative” technologies. Call it the photosynthesis option: because plants inhale carbon dioxide and store it in their roots, stems, and leaves, one can remove carbon from the atmosphere by growing trees, planting cover crops, burying charred plant materials underground, and other kindred methods. In effect, carbon negative technologies can turn back the clock on global warming, making the aforementioned descent from the 2 C overshoot to the 1.5 C goal later in this century theoretically possible. Carbon-negative technologies thus far remain unproven at the scale needed, however; more research and deployment is required, according to the study.

Together, the Nature Climate Change study and Hansen’s new paper give credence to the many developing nations and climate justice advocates who have called for more ambitious action. The authors of the Nature Climate Change study point out that the 1.5 degrees goal “is supported by more than 100 countries worldwide, including those most vulnerable to climate change.” In May, the governments of 20 of those countries, including the Philippines, Costa Rica, Kenya, and Bangladesh, declared the 2 degrees target “inadequate” and called for governments to “reconsider” it in Paris.

Hansen too is confident that the world “could actually come in well under 2 degrees, if we make the price of fossil fuels honest.”

That means making the market price of gasoline and other products derived from fossil fuels reflect the enormous costs that burning those fuels currently externalizes onto society as a whole. Economists from left to right have advocated achieving this by putting a rising fee or tax on fossil fuels. This would give businesses, governments, and other consumers an incentive to shift to non-carbon fuels such as solar, wind, nuclear, and, best of all, increased energy efficiency. (The cheapest and cleanest fuel is the fuel you don’t burn in the first place.)

But putting a fee on fossil fuels will raise their price to consumers, threatening individual budgets and broader economic prospects, as opponents will surely point out. Nevertheless, higher prices for carbon-based fuels need not have injurious economic effects if the fees driving those higher prices are returned to the public to spend as it wishes. It’s been done that way for years with great success in Alaska, where all residents receive an annual check in compensation for the impact the Alaskan oil pipeline has on the state.

“Tax Pollution, Pay People” is the bumper sticker summary coined by activists at the Citizens Climate Lobby. Legislation to this effect has been introduced in both houses of the U.S. Congress.

Meanwhile, there are also a host of other reasons to believe it’s not too late to preserve a livable climate for young people and future generations.

The transition away from fossil fuels has begun and is gaining speed and legitimacy. In 2014, global greenhouse gas emissions remained flat even as the world economy grew—a first. There has been a spectacular boom in wind and solar energy, including in developing countries, as their prices plummet. These technologies now qualify as a “disruptive” economic force that promises further breakthroughs, said Achim Steiner, executive director of the UN Environment Programme.

Coal, the most carbon-intensive conventional fossil fuel, is in a death spiral, partly thanks to another piece of encouraging news: the historic climate agreement the U.S. and China reached last November, which envisions both nations slashing coal consumption (as China is already doing). Hammering another nail into coal’s coffin, the leaders of Great Britain’s three main political parties pledged to phase out coal, no matter who won the general elections last May.

“If you look at the long-term [for coal], it’s not getting any better,” said Standard & Poor’s Aneesh Prabhu when S&P downgraded coal company bonds to junk status. “It’s a secular decline,” not a mere cyclical downturn.

Last but not least, a vibrant mass movement has arisen to fight climate change, most visibly manifested when hundreds of thousands of people thronged the streets of New York City last September, demanding action from global leaders gathered at the UN. The rally was impressive enough that it led oil and gas giant ExxonMobil to increase its internal estimate of how likely the U.S. government is to take strong action. “That many people marching is clearly going to put pressure on government to do something,” an ExxonMobil spokesman told Bloomberg Businessweek.

The climate challenge has long amounted to a race between the imperatives of science and the contingencies of politics. With Hansen’s paper, the science has gotten harsher, even as the Nature Climate Change study affirms that humanity can still choose life, if it will. The question now is how the politics will respond—now, at Paris in December, and beyond.

Mark Hertsgaard has reported on politics, culture, and the environment from more than 20 countries and written six books, including “HOT: Living Through the Next Fifty Years on Earth.”

*   *   *

Experts make dire prediction about sea levels (CBS)


In the future, there could be major flooding along every coast. So says a new study that warns the world’s seas are rising.

Ever-warming oceans that are melting polar ice could raise sea levels 15 feet in the next 50 to 100 years, NASA’s former climate chief now says. That’s five times higher than previous predictions.

“This is the biggest threat the planet faces,” said James Hansen, the co-author of the new journal article raising that alarm scenario.

“If we get sea level rise of several meters, all coastal cities become dysfunctional,” he said. “The implications of this are just incalculable.”

If ocean levels rise just 10 feet, areas like Miami, Boston, Seattle and New York City would face flooding.

The melting ice would cool ocean surfaces at the poles even more, even as the overall climate continues to warm. The temperature difference would fuel even more volatile weather.

“As the atmosphere gets warmer and there’s more water vapor, that’s going to drive stronger thunderstorms, stronger hurricanes, stronger tornadoes, because they all get their energy from the water vapor,” said Hansen.

Nearly a decade ago, Hansen told “60 Minutes” we had 10 years to get global warming under control, or we would reach a “tipping point.”

“It will be a situation that is out of our control,” he said. “We’re essentially at the edge of that. That’s why this year is a critical year.”

Critical because of a United Nations meeting in Paris that is designed to reach legally binding agreements on carbon emissions, the greenhouse gases that create global warming.

*   *   *

Sea Levels Could Rise Much Faster than Thought (Climate Denial Crock of the Week)

with Peter Sinclair | July 21, 2015

Washington Post:

James Hansen has often been out ahead of his scientific colleagues.

With his 1988 congressional testimony, the then-NASA scientist is credited with putting the global warming issue on the map by saying that a warming trend had already begun. “It is time to stop waffling so much and say that the evidence is pretty strong that the greenhouse effect is here,” Hansen famously testified.

Now Hansen — who retired in 2013 from his NASA post, and is currently an adjunct professor at Columbia University’s Earth Institute — is publishing what he says may be his most important paper. Along with 16 other researchers — including leading experts on the Greenland and Antarctic ice sheets — he has authored a lengthy study outlining a scenario of potentially rapid sea level rise combined with more intense storm systems.

It’s an alarming picture of where the planet could be headed — and hard to ignore, given its author. But it may also meet with considerable skepticism in the broader scientific community, given that its scenarios of sea level rise occur more rapidly than those ratified by the United Nations’ Intergovernmental Panel on Climate Change in its latest assessment of the state of climate science, published in 2013.

In the new study, Hansen and his colleagues suggest that the “doubling time” for ice loss from West Antarctica — the time period over which the amount of loss could double — could be as short as 10 years. In other words, a non-linear process could be at work, triggering major sea level rise in a time frame of 50 to 200 years. By contrast, Hansen and colleagues note, the IPCC assumed more of a linear process, suggesting only around 1 meter of sea level rise, at most, by 2100.
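
To see why a 10-year doubling time is so different from a linear trend, here is a back-of-envelope calculation; the initial rate of 1 mm per year of sea level equivalent is a hypothetical round number, not a figure from the paper:

```latex
% Exponential ice loss with doubling time \tau: rate and cumulative loss.
\[
m(t) = m_0\, 2^{t/\tau}, \qquad
M(t) = \int_0^t m(s)\,\mathrm{d}s
     = \frac{m_0\,\tau}{\ln 2}\,\bigl(2^{t/\tau} - 1\bigr).
\]
% With \tau = 10 yr and a hypothetical m_0 = 1 mm/yr of sea level equivalent,
% 50 years gives M(50) = (10/\ln 2)(2^5 - 1) \approx 447 mm, nearly half a
% meter, versus 50 mm if the rate stayed constant: a ninefold difference.
```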

Here, a clip from our extended interview with Eric Rignot in December of 2014.  Rignot is one of the co-authors of the new study.

Slate:

The study—written by James Hansen, NASA’s former lead climate scientist, and 16 co-authors, many of whom are considered among the top in their fields—concludes that glaciers in Greenland and Antarctica will melt 10 times faster than previous consensus estimates, resulting in sea level rise of at least 10 feet in as little as 50 years. The study, which has not yet been peer reviewed, brings new importance to a feedback loop in the ocean near Antarctica that results in cooler freshwater from melting glaciers forcing warmer, saltier water underneath the ice sheets, speeding up the melting rate. Hansen, who is known for being alarmist and also right, acknowledges that his study implies change far beyond previous consensus estimates. In a conference call with reporters, he said he hoped the new findings would be “substantially more persuasive than anything previously published.” I certainly find them to be.

We conclude that continued high emissions will make multi-meter sea level rise practically unavoidable and likely to occur this century. Social disruption and economic consequences of such large sea level rise could be devastating. It is not difficult to imagine that conflicts arising from forced migrations and economic collapse might make the planet ungovernable, threatening the fabric of civilization.

The science of ice melt rates is advancing so fast, scientists have generally been reluctant to put a number to what is essentially an unpredictable, non-linear response of ice sheets to a steadily warming ocean. With Hansen’s new study, that changes in a dramatic way. One of the study’s co-authors is Eric Rignot, whose own study last year found that glacial melt from West Antarctica now appears to be “unstoppable.” Chris Mooney, writing for Mother Jones, called that study a “holy shit” moment for the climate.

Daily Beast:

New climate science brings good news as well as bad. Humanity can limit temperature rise to 1.5 degrees C if it so chooses, according to a little-noticed study by experts at the Potsdam Institute for Climate Impact Research (now perhaps the world’s foremost climate research center) and the International Institute for Applied Systems Analysis, published in Nature Climate Change in May.


“Actions for returning global warming to below 1.5 degrees Celsius by 2100 are in many ways similar to those limiting warming to below 2 degrees Celsius,” said Joeri Rogelj, a lead author of the study. “However … emission reductions need to scale up swiftly in the next decades.” And there’s a significant catch: Even this relatively optimistic study concludes that it’s too late to prevent global temperature rising by 2 degrees C. But this overshoot of the 2 C target can be made temporary, the study argues; the total increase can be brought back down to 1.5 C later in the century.

Besides the faster emissions reductions Rogelj referenced, two additional tools are essential, the study outlines. Energy efficiency—shifting to less wasteful lighting, appliances, vehicles, building materials and the like—is already the cheapest, fastest way to reduce emissions. Improved efficiency has made great progress in recent years but will have to accelerate, especially in emerging economies such as China and India.

Also necessary will be breakthroughs in so-called “carbon negative” technologies. Call it the photosynthesis option: because plants inhale carbon dioxide and store it in their roots, stems, and leaves, one can remove carbon from the atmosphere by growing trees, planting cover crops, burying charred plant materials underground, and other kindred methods. In effect, carbon negative technologies can turn back the clock on global warming, making the aforementioned descent from the 2 C overshoot to the 1.5 C goal later in this century theoretically possible. Carbon-negative technologies thus far remain unproven at the scale needed, however; more research and deployment are required, according to the study.

*   *   *

Earth’s Most Famous Climate Scientist Issues Bombshell Sea Level Warning (Slate)

Monday’s new study greatly increases the potential for catastrophic near-term sea level rise. Here, Miami Beach, among the most vulnerable cities to sea level rise in the world. Photo by Joe Raedle/Getty Images

In what may prove to be a turning point for political action on climate change, a breathtaking new study casts extreme doubt about the near-term stability of global sea levels.

The study—written by James Hansen, NASA’s former lead climate scientist, and 16 co-authors, many of whom are considered among the top in their fields—concludes that glaciers in Greenland and Antarctica will melt 10 times faster than previous consensus estimates, resulting in sea level rise of at least 10 feet in as little as 50 years. The study, which has not yet been peer-reviewed, brings new importance to a feedback loop in the ocean near Antarctica that results in cooler freshwater from melting glaciers forcing warmer, saltier water underneath the ice sheets, speeding up the melting rate. Hansen, who is known for being alarmist and also right, acknowledges that his study implies change far beyond previous consensus estimates. In a conference call with reporters, he said he hoped the new findings would be “substantially more persuasive than anything previously published.” I certainly find them to be.

To come to their findings, the authors used a mixture of paleoclimate records, computer models, and observations of current rates of sea level rise, but “the real world is moving somewhat faster than the model,” Hansen says.

Hansen’s study does not attempt to predict the precise timing of the feedback loop, only that it is “likely” to occur this century. The implications are mindboggling: In the study’s likely scenario, New York City—and every other coastal city on the planet—may only have a few more decades of habitability left. That dire prediction, in Hansen’s view, requires “emergency cooperation among nations.”

We conclude that continued high emissions will make multi-meter sea level rise practically unavoidable and likely to occur this century. Social disruption and economic consequences of such large sea level rise could be devastating. It is not difficult to imagine that conflicts arising from forced migrations and economic collapse might make the planet ungovernable, threatening the fabric of civilization.

The science of ice melt rates is advancing so fast that scientists have generally been reluctant to put a number to what is essentially an unpredictable, nonlinear response of ice sheets to a steadily warming ocean. With Hansen’s new study, that changes in a dramatic way. One of the study’s co-authors is Eric Rignot, whose own study last year found that glacial melt from West Antarctica now appears to be “unstoppable.” Chris Mooney, writing for Mother Jones, called that study a “holy shit” moment for the climate.

One necessary note of caution: Hansen’s study comes via a nontraditional publishing decision by its authors. The study will be published in Atmospheric Chemistry and Physics, an open-access “discussion” journal, and will not have formal peer review prior to its appearance online later this week. [Update, July 23: The paper is now available.] The complete discussion draft circulated to journalists was 66 pages long, and included more than 300 references. The peer review will take place in real time, with responses to the work by other scientists also published online. Hansen said this publishing timeline was necessary to make the work public as soon as possible before global negotiators meet in Paris later this year. Still, the lack of traditional peer review and the fact that this study’s results go far beyond what’s been previously published will likely bring increased scrutiny. On Twitter, Ruth Mottram, a climate scientist whose work focuses on Greenland and the Arctic, was skeptical of such enormous rates of near-term sea level rise, though she defended Hansen’s decision to publish in a nontraditional way.

In 2013, Hansen left his post at NASA to become a climate activist because, in his words, “as a government employee, you can’t testify against the government.” In a wide-ranging December 2013 study, conducted to support Our Children’s Trust, a group advancing legal challenges to lax greenhouse gas emissions policies on behalf of minors, Hansen called for a “human tipping point”—essentially, a social revolution—as one of the most effective ways of combating climate change, though he still favors a bilateral carbon tax agreed upon by the United States and China as the best near-term climate policy. In the new study, Hansen writes, “there is no morally defensible excuse to delay phase-out of fossil fuel emissions as rapidly as possible.”

Asked whether Hansen has plans to personally present the new research to world leaders, he said: “Yes, but I can’t talk about that today.” What’s still uncertain is whether, as with so many previous dire warnings, world leaders will be willing to listen.

*   *   *

Ice Melt, Sea Level Rise and Superstorms (Climate Sciences, Awareness and Solutions / Earth Institute, Columbia University)

23 July 2015

James Hansen

The paper “Ice melt, sea level rise and superstorms: evidence from paleoclimate data, climate modeling, and modern observations that 2°C global warming is highly dangerous” has been published in Atmospheric Chemistry and Physics Discussions and is freely available here.

The paper draws on a large body of work by the research community, as indicated by the 300 references. No doubt we missed some important relevant contributions, which we may be able to rectify in the final version of the paper. I thank all the researchers who provided data or information, many of whom I may have failed to include in the acknowledgments, as the work for the paper took place over a period of several years.

I am especially grateful to the Durst family for a generous grant that allowed me to work full time this year on finishing the paper, as well as the other supporters of our program Climate Science, Awareness and Solutions at the Columbia University Earth Institute.

In the conceivable event that you do not read the full paper plus supplement, I include the Acknowledgments here:

Acknowledgments. Completion of this study was made possible by a generous gift from The Durst Family to the Climate Science, Awareness and Solutions program at the Columbia University Earth Institute. That program was initiated in 2013 primarily via support from the Grantham Foundation for Protection of the Environment, Jim and Krisann Miller, and Gerry Lenfest and sustained via their continuing support. Other substantial support has been provided by the Flora Family Foundation, Dennis Pence, the Skoll Global Threats Fund, Alexander Totic and Hugh Perrine. We thank Anders Carlson, Elsa Cortijo, Nil Irvali, Kurt Lambeck, Scott Lehman, and Ulysses Ninnemann for their kind provision of data and related information. Support for climate simulations was provided by the NASA High-End Computing (HEC) Program through the NASA Center for Climate Simulation (NCCS) at Goddard Space Flight Center.

Climate models are even more accurate than you thought (The Guardian)

The difference between modeled and observed global surface temperature changes is 38% smaller than previously thought

Looking across the frozen sea of Ullsfjord in Norway. Melting Arctic sea ice is one complicating factor in comparing modeled and observed surface temperatures. Photograph: Neale Clark/Robert Harding World Imagery/Corbis

Global climate models aren’t given nearly enough credit for their accurate global temperature change projections. As the 2014 IPCC report showed, observed global surface temperature changes have been within the range of climate model simulations.

Now a new study shows that the models were even more accurate than previously thought. In previous evaluations like the one done by the IPCC, climate model simulations of global surface air temperature were compared to global surface temperature observational records like HadCRUT4. However, over the oceans, HadCRUT4 uses sea surface temperatures rather than air temperatures.

A depiction of how global temperatures calculated from models use air temperatures above the ocean surface (right frame), while observations are based on the water temperature in the top few metres (left frame). Created by Kevin Cowtan.

Thus looking at modeled air temperatures and HadCRUT4 observations isn’t quite an apples-to-apples comparison for the oceans. As it turns out, sea surface temperatures haven’t been warming as fast as marine air temperatures, so the comparison introduces a bias that makes the observations look cooler than the model simulations. As lead author Kevin Cowtan told me,

We have highlighted the fact that the planet does not warm uniformly. Air temperatures warm faster than the oceans, air temperatures over land warm faster than global air temperatures. When you put a number on global warming, that number always depends on what you are measuring. And when you do a comparison, you need to ensure you are comparing the same things.

The model projections have generally reported global air temperatures. That’s quite helpful, because we generally live in the air rather than the water. The observations, by mixing air and water temperatures, are expected to slightly underestimate the warming of the atmosphere.

The new study addresses this problem by instead blending the modeled air temperatures over land with the modeled sea surface temperatures to allow for an apples-to-apples comparison. The authors also identified another challenging issue for these model-data comparisons in the Arctic. Over sea ice, surface air temperature measurements are used, but for open ocean, sea surface temperatures are used. As co-author Michael Mann notes, as Arctic sea ice continues to melt away, this is another factor that accurate model-data comparisons must account for.

One key complication that arises is that the observations typically extrapolate land temperatures over sea ice covered regions since the sea surface temperature is not accessible in that case. But the distribution of sea ice changes seasonally, and there is a long-term trend toward decreasing sea ice in many regions. So the observations actually represent a moving target.
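A minimal sketch of the blending the authors describe may help; the grid, anomalies, and masks below are toy values, not data from the study:

```python
import numpy as np

# Blend model air temperature over land and sea ice with model sea
# surface temperature over open ocean, mimicking how HadCRUT4 samples
# the real world. All values are illustrative.
air_temp = np.array([1.2, 1.1, 0.9, 0.8])  # K anomaly, surface air
sst      = np.array([0.9, 0.8, 0.7, 0.6])  # K anomaly, ocean water
land_or_ice = np.array([True, False, True, False])  # grid-cell type

# Air temperature where there is land or sea ice, water temperature
# elsewhere. As ice retreats, cells flip from air to water sampling,
# which is the "moving target" described above.
blended = np.where(land_or_ice, air_temp, sst)
print("pure air-temperature mean:", air_temp.mean())  # 1.0
print("HadCRUT4-style blended   :", blended.mean())   # 0.875
```

The blended mean comes out cooler than the pure air-temperature mean, which is exactly the direction of the bias the study corrects for.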

A depiction of how as sea ice retreats, some grid cells change from taking air temperatures to taking water temperatures. If the two are not on the same scale, this introduces a bias. Created by Kevin Cowtan.

When accounting for these factors, the study finds that the difference between observed and modeled temperatures since 1975 is smaller than previously believed. The models had projected a 0.226°C per decade global surface air warming trend for 1975–2014 (and 0.212°C per decade over the geographic area covered by the HadCRUT4 record). However, when matching the HadCRUT4 methods for measuring sea surface temperatures, the modeled trend is reduced to 0.196°C per decade. The observed HadCRUT4 trend is 0.170°C per decade.
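The quoted reduction follows directly from those trends; a quick arithmetic check, using the per-decade numbers above:

```python
# Reproduce the "38% smaller" discrepancy from the quoted trends
# (deg C per decade, 1975-2014, over HadCRUT4's coverage).
model_air   = 0.212  # modeled air-temperature trend
model_blend = 0.196  # modeled trend using HadCRUT4's blended method
observed    = 0.170  # observed HadCRUT4 trend

old_gap = model_air - observed    # 0.042
new_gap = model_blend - observed  # 0.026
print(f"discrepancy shrinks by {(1 - new_gap / old_gap) * 100:.0f}%")  # 38%
```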

So when doing an apples-to-apples comparison, the difference between modeled global temperature simulations and observations is 38% smaller than previous estimates. Additionally, as noted in a 2014 paper led by NASA GISS director Gavin Schmidt, less energy from the sun has reached the Earth’s surface than anticipated in these model simulations, both because solar activity declined more than expected, and volcanic activity was higher than expected. Ed Hawkins, another co-author of this study, wrote about this effect.

Combined, the apparent discrepancy between observations and simulations of global temperature over the past 15 years can be partly explained by the way the comparison is done (about a third), by the incorrect radiative forcings (about a third) and the rest is either due to climate variability or because the models are slightly over sensitive on average. But, the room for the latter effect is now much smaller.

Comparison of 84 climate model simulations (using RCP8.5) against HadCRUT4 observations (black), using either air temperatures (red line and shading) or blended temperatures using the HadCRUT4 method (blue line and shading). The upper panel shows anomalies derived from the unmodified climate model results, the lower shows the results adjusted to include the effect of updated forcings from Schmidt et al. (2014).

As Hawkins notes, the remaining discrepancy between modeled and observed temperatures may come down to climate variability; namely the fact that there has been a preponderance of La Niña events over the past decade, which have a short-term cooling influence on global surface temperatures. When there are more La Niñas, we expect temperatures to fall below the average model projection, and when there are more El Niños, we expect temperatures to be above the projection, as may be the case when 2015 breaks the temperature record.

We can’t predict changes in solar activity, volcanic eruptions, or natural ocean cycles ahead of time. If we want to evaluate the accuracy of long-term global warming model projections, we have to account for the difference between the simulated and observed changes in these factors. When the authors of this study did so, they found that climate models have very accurately projected the observed global surface warming trend.

In other words, as I discussed in my book and Denial101x lecture, climate models have proven themselves reliable in predicting long-term global surface temperature changes. In fact, even more reliable than I realized.

Denial101x climate science success stories lecture by Dana Nuccitelli.

There’s a common myth that models are unreliable, often based on apples-to-oranges comparisons, like looking at satellite estimates of temperatures higher in the atmosphere versus modeled surface air temperatures. Or, some contrarians like John Christy will only consider the temperature high in the atmosphere, where satellite estimates are less reliable, and where people don’t live.

This new study has shown that when we do an apples-to-apples comparison, climate models have done a good job projecting the observed temperatures where humans live. And those models predict that unless we take serious and immediate action to reduce human carbon pollution, global warming will continue to accelerate into dangerous territory.

There never was a global warming ‘pause,’ NOAA study concludes (Environment & Energy Publishing)

Gayathri Vaidyanathan, E&E reporter

Published: Friday, June 5, 2015

The global warming “pause” does not exist, according to scientists at the National Oceanic and Atmospheric Administration.

Their finding refutes a theory that has dominated climate science in recent years. The Intergovernmental Panel on Climate Change (IPCC) in 2013 found that global temperatures in recent years have not risen as quickly as they did in the 20th century. That launched an academic hunt for the missing heat in the oceans, volcanoes and solar rays. Meanwhile, climate deniers triumphantly crowed that global warming has paused or gone on a “hiatus.”

But it now appears that the pause never was. NOAA scientists have fixed some small errors in global temperature data and found that temperatures over the past 15 years have been rising at a rate comparable to warming over the 20th century. The study was published yesterday in Science.

That a minor change to the analysis can switch the outcome from a hiatus to increased warming shows “how fragile a concept it [the hiatus] was in the first place,” said Gavin Schmidt, director of the NASA Goddard Institute for Space Studies, who was unaffiliated with the study.

According to the NOAA study, the world has warmed since 1998 by 0.11 degree Celsius per decade. Scientists had previously calculated that the trend was about half that.

The new rate is equal to the rate of warming seen between 1951 and 1999.

There has been no slowdown in the rate of global warming, said Thomas Karl, director of NOAA’s National Centers for Environmental Information and lead author of the study.

“Global warming is firmly entrenched on our planet, and it continues to progress and is likely to continue to do so in the future unless emissions of greenhouse gases are substantially altered,” he said.

Errors from weather stations, buoys and buckets

That NOAA has to adjust temperature readings is not unusual. Many factors can affect raw temperature measurements, according to a study by Karl in 1988.

For instance, a weather station may be situated beneath a tree, which would bias temperatures low. Measurements made near a parking lot would read warm due to the waves of heat emanating from asphalt surfaces. NOAA and other agencies adjust the raw temperature data to remove such biases.

It has become clear in recent years that some biases still persist in the data, particularly of ocean temperatures. The culprit: buckets.

Ships traverse the world, and, occasionally, workers onboard dip a bucket over the hull and bring up water that they measure using a thermometer. The method is old school and error prone — water in a bucket is usually cooler than the ocean.

For a long time, scientists had assumed that most ships no longer use buckets and instead measure water siphoned from the ocean to cool ship engines. The latter method is more robust. But data released last year showed otherwise and compelled NOAA to correct for this bias.

A second correction involved sensor-laden buoys interspersed across the oceans whose temperature readings are biased low. Karl and his colleagues corrected for this issue, as well.

The corrections “made a significant impact,” Karl said. “They added about 0.06 degrees C per decade additional warming since 2000.”
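As a toy illustration of that kind of adjustment, the sketch below applies a fixed ship-buoy offset to buoy readings; the 0.12°C value is the figure commonly cited for NOAA's correction, and the readings themselves are invented:

```python
# Buoys read systematically cooler than ship engine-intake sensors, and
# the share of buoys grows over time, imparting a spurious cooling trend
# unless the offset is removed. Offset and data are illustrative.
SHIP_BUOY_OFFSET = 0.12  # deg C, commonly cited value

readings = [
    # (year, source, reported_sst_degC)
    (1998, "ship", 20.50),
    (2003, "ship", 20.55),
    (2008, "buoy", 20.48),  # cool-biased
    (2013, "buoy", 20.53),  # cool-biased
]

for year, source, sst in readings:
    corrected = sst + SHIP_BUOY_OFFSET if source == "buoy" else sst
    print(year, source, f"{corrected:.2f}")
```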

The ‘slowdown hasn’t gone away’

What that means for the global warming hiatus depends on whom you ask. The warming trend over the past 15 years is comparable to the trend between 1950 and 1998 (a 48-year stretch), which led Karl to say that global warming never slowed.

Other scientists were not fully convinced. For a truly apples-to-apples comparison, the past 15 years should be compared with other 15-year stretches, said Peter Stott, head of the climate monitoring and attribution team at the U.K. Met Office.

For instance, the globe warmed more slowly in the past 15 years than between 1983 and 1998 (the previous 15-year stretch), even with NOAA’s new data corrections, Stott said.

“The slowdown hasn’t gone away,” he said in an email. “While the Earth continues to accumulate energy as a result of increasing man-made greenhouse gas emissions … global temperatures have not increased smoothly.”

The disagreements arise because assigning trends — including the trend of a “hiatus” — to global warming depends on the time frame of reference.

“Trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends,” the IPCC stated in 2013, even as it discussed the pause.
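That endpoint sensitivity is easy to demonstrate on synthetic data: fit short trends to a series with a perfectly steady warming rate plus weather noise, and the 15-year windows disagree with each other anyway. Everything below is synthetic:

```python
import numpy as np

rng = np.random.default_rng(42)
years = np.arange(1950, 2015)
# Steady 0.15 C/decade warming plus year-to-year noise.
temps = 0.015 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

for start, end in ((1984, 1998), (2000, 2014)):
    window = (years >= start) & (years <= end)
    slope = np.polyfit(years[window], temps[window], 1)[0]
    print(f"{start}-{end} trend: {slope * 10:+.3f} C/decade")
```

Both windows sample the same underlying 0.15 C/decade trend, yet the fitted values scatter around it, which is why short windows say little about the long-term rate.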

Robert Kaufmann, an environment professor at Boston University who was unaffiliated with the study, called trends a “red herring.”

A trend implies that the planet will warm, decade after decade, at a steady clip. There is no reason why that should be the case, Kaufmann said. Many factors — human emissions of warming and cooling gases, natural variability, and external factors such as the sun — feed into Earth’s climate. The relative contributions of each factor can vary by year, decade, century or on even larger time scales.

“There is no scientific basis to assume that the climate is going to warm at the same rate year after year, decade after decade,” he said.

Copying the language of skeptics

Trends are a powerful weapon in the hands of climate deniers. As early as 2006, deniers used the slowdown of warming from 1998 onward to say that global warming had stopped or paused.

The idea of a “pause” seeped into academia, launching dozens of studies into what might have caused it. But there was a subtle difference between scientists’ understanding of the pause and that of the skeptics; scientists never believed that warming had stopped, only that it had slowed compared with the rapidly warming ’90s. They wanted to know why.

Over the years, scientists have unraveled the contributions of volcanoes to global cooling, the increased uptake of heat by the Pacific Ocean, the cooling role of La Niñas and other drivers of natural variability. Their understanding of our planet’s climate evolved rapidly.

As scientists wrote up their findings, they unwittingly adopted the skeptics’ language of the “pause,” said Stephan Lewandowsky, a psychologist at the University of Bristol who was unaffiliated with the NOAA study. That was problematic.

“That’s sort of a subtle semantic thing, but it is really important because it suggests that these [scientists] bought into the existence of the hiatus,” he said.

Then, in 2013, the IPCC wrote about the pause. The German government complained that the term implies that warming had stopped, which is inaccurate. The objection was ignored.

NOAA’s strong refutation of the hiatus is particularly weighty because it comes from a government lab, and the work was headed by Karl, a pioneer of temperature reanalysis studies.

NOAA will be using the data corrections to assess global temperatures from July onward, Karl said. NASA is discussing internally whether to apply the fixes suggested in the study, according to Schmidt of NASA.

The study was greeted by Democrats in Congress as proof that climate change is real. Sen. Barbara Boxer (D-Calif.), ranking member of the Environment and Public Works Committee, used it as an opportunity to chide her opponents.

“Climate change deniers in Congress need to stop ignoring the fact that the planet may be warming at an even faster rate than previously observed, and we must take action now to reduce dangerous carbon pollution,” she said in a statement.

Cemaden Issues New Projection for the Cantareira Reserve During the Dry Season (MCTI/Cemaden)

A survey by the National Center for Monitoring and Early Warning of Natural Disasters indicates rainfall and reserves below the historical average through December

The National Center for Monitoring and Early Warning of Natural Disasters (Cemaden/MCTI) describes, in its latest report, published on Wednesday (27), the critical situation of the Cantareira System reservoir, indicating rainfall and reserves below the historical average through December of this year.

This situation will occur even with the inclusion of data on the reduction in water withdrawals from the reservoir planned for September through November, announced in the joint communiqué of the National Water Agency (ANA) and the Department of Water and Electric Power (DAEE) in the last week of May.

Based on the rain-gauge networks of Cemaden and the DAEE covering the catchment sub-basins of the Cantareira System, the accumulated spatial mean precipitation from October 2014 through March 2015 was 879 millimeters (mm), equivalent to 73.5% of the climatological mean of 1,161 mm recorded for the same period.

The accumulated spatial mean precipitation in April 2015 was 52.4 mm, or 58.4% of the month's climatological mean of 89.83 mm. Accumulated rainfall from May 1 through May 29, 2015 averaged 55.3 mm, or 70.7% of the historical mean of 78.2 mm for the same period. The report also presents mean precipitation figures based on data from the São Paulo State Basic Sanitation Company (Sabesp), which differ somewhat from Cemaden's.

In the current situation, the mean inflow to the Cantareira System (that is, the balance between the volume of water and its replenishment by rainfall) is below the climatological mean. The mean inflow to the Cantareira System in May was 14.02 cubic meters per second (m³/s), 63.4% below the monthly mean inflow of 38.27 m³/s. It is also below the historical minimum inflow of 19.90 m³/s, representing only 29.5% of the historical mean.

Projections

The Cantareira System water-scenario report, released periodically since January 2015, projects inflows with the hydrological model implemented by Cemaden, based on the seven-day rainfall forecast from Inpe's Center for Weather Forecasting and Climate Studies (CPTEC). From the eighth day onward, projections are presented for five rainfall scenarios (the historical mean, and 25% and 50% below and above it). Finally, given a scenario for water extraction from the Cantareira System, projections of the evolution of storage are obtained.

The latest report assumed total extraction from the Cantareira System of 17.0 m³ per second from June 1 through August 31 and again in December 2015. From September 1 through November 30, the assumed withdrawal from the reservoirs was 13.5 m³ per second.
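At its core the projection is a running water balance: storage changes by inflow minus extraction, with inflow drawn from the rainfall scenarios. A minimal sketch of the idea follows; the starting storage and inflow are illustrative assumptions, not values from Cemaden's calibrated hydrological model:

```python
# Toy dry-season storage projection: storage evolves as inflow minus
# extraction. Starting storage and inflow are assumed for illustration.
TOTAL_RESERVE = 1269.5e6  # m3, useful volume plus the two dead volumes

storage = 160.0e6         # m3, assumed storage on June 1
extraction = 17.0         # m3/s, the June-August extraction rule
inflow = 14.0             # m3/s, assumed dry-season mean inflow

SECONDS_PER_DAY = 86_400
for day in range(92):     # roughly June 1 through August 31
    storage += (inflow - extraction) * SECONDS_PER_DAY

print(f"storage after 92 days: {storage / 1e6:.1f} million m3 "
      f"({100 * storage / TOTAL_RESERVE:.1f}% of the total reserve)")
```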

In the scenario with rainfall at the climatological mean, the stored volume at the end of the dry season, in early October, would be approximately 188.66 million m³. "That stored volume represents 14.9% of the Cantareira's total reserve, that is, the sum of the useful volume and the two dead volumes, estimated at 1,269.5 million m³ in total," notes Cemaden hydrologist Adriana Cuartas, who is responsible for the Cantareira report.

In this scenario of rainfall at the historical mean, the stored volume on December 1, 2015 would be approximately 227.72 million m³, representing 17.9% of the Cantareira's total reserve.

For a scenario with rainfall equal to the climatological mean, the so-called dead volume 1 would be recovered over the last week of December, approximately. Under a scenario with rainfall 25% above the climatological mean, dead volume 1 would be recovered in the last week of November.

Access the document.

(MCTI, via Cemaden)

Sabesp Pours Millions into Questioned Rainmaking Technique (UOL)

Thamires Andrade*

From UOL, in São Paulo

28/05/2015 12:09

By the end of this year, Sabesp will have transferred R$ 12.5 million without holding a bidding process (Lucas Lacaz Ruiz/Estadão Conteúdo)

While citing the need to "guarantee economic and financial equilibrium" to justify higher water bills, Sabesp (the São Paulo State Basic Sanitation Company) maintains a deal worth more than R$ 8 million with ModClima, a company that offers a technique for inducing artificial rain. Experts consulted by UOL, however, say the method is not effective.

According to Sabesp documents obtained under the Access to Information Law, the company has already signed four contracts with the firm. Under the two most recent, signed last year, Sabesp has paid R$ 2.4 million of a planned total of R$ 8.1 million to make rain over the Cantareira and Alto Tietê systems, those hardest hit by the water crisis in the São Paulo metropolitan region.

Under the two earlier contracts, in force in 2007/2008 and 2009/2013 respectively, R$ 4.3 million was transferred, adjustments included. Since 2007, therefore, ModClima has received almost R$ 7 million from Sabesp.

By the end of this year, Sabesp will have transferred R$ 12.5 million without any kind of competitive bidding. The company claims that no such process was necessary because ModClima holds a "patent on the technology used." In other words, it would be the only company holding this type of technology and, consequently, the only one able to provide the service.

For Augusto Jose Pereira Filho, a professor at IAG-USP (the University of São Paulo's Institute of Astronomy, Geophysics and Atmospheric Sciences), Sabesp hired the company so as not to be accused of doing nothing in the face of the water supply crisis.

"It was money thrown away. It would have been better to spend those funds on other goals, such as awareness campaigns and reducing water losses, than on techniques that still lack scientific validation," he says.

The technique

The technology used by ModClima is called seeding and is carried out with an airplane that releases water droplets inside a cloud to accelerate its precipitation.

The droplets gain volume and, once they are heavy enough, localized rain falls. According to the company, 5 to 40 millimeters of rain results. Seeding takes between 20 and 40 minutes.

"Seeding consists of imitating the growth process of hydrometeors [water particles in the atmosphere], which trigger precipitation when they reach the right size inside the cloud. An airplane releases ice droplets, crystals or other particles into the cloud, depending on the cloud type [warm or cold], to hasten the onset of rain, but for that you need to be in the right place at the right time," explains Professor Carlos Augusto Morales Rodriguez of the Department of Atmospheric Sciences at IAG-USP.

The cloud must have a suitable density for precipitation to occur, but, according to Rodriguez, meteorology has difficulty identifying clouds in the right condition for the process to work.

"The weather radar used by the company hired by Sabesp cannot identify a cloud that is about to precipitate, only clouds that are already raining. The company's technique is therefore ineffective, since by the time the airplane enters the cloud, it is already raining," Rodriguez explains.

Rodriguez also states that the company seeded the Cantareira system as if the area had warm clouds. "The state of São Paulo is dominated by cold clouds, and accelerating precipitation would require a technique suited to this region, such as the use of silver iodide and dry ice," he explains.

Both Rodriguez and Pereira Filho carried out independent evaluations of the company's work and concluded that the technique did not have the desired effectiveness.

"In a 2003/2004 evaluation we found that the technique did not work, but even so Sabesp hired the company again," says Pereira Filho. "I was invited by a Sabesp director to talk with ModClima's representatives and, during the meeting, their accounts made no sense from a scientific standpoint."

He also questions the technique's results last year. According to the Sabesp document obtained under the Access to Information Law, last year alone the technique induced precipitation of 25 hm³ (cubic hectometers, equivalent to 25 billion liters) over the Cantareira system and 6 hm³ (6 billion liters) over the Alto Tietê system.

"Sabesp reports claimed a 30% increase in rainfall over the systems thanks to the technique, but the percentage and the results are dubious, because it is not easy to measure how much the seeding actually contributed to increasing local precipitation," Pereira Filho argues.

Contacted for comment, ModClima said its communications are currently centralized at Sabesp and that it would not answer the reporter's questions.

Sabesp did not make any representative available to explain the contracting of artificial rainmaking services, nor did it answer follow-up questions sent by UOL. *With contributions from Wellington Ramalhoso

Extending climate predictability beyond El Niño (Science Daily)

Date: April 21, 2015

Source: University of Hawaii – SOEST

Summary: Tropical Pacific climate variations and their global weather impacts may be predicted much further in advance than previously thought, according to research by an international team of climate scientists. The source of this predictability lies in the tight interactions between the ocean and the atmosphere and among the Atlantic, the Pacific and the Indian Oceans. Such long-term tropical climate forecasts are useful to the public and policy makers, researchers say.


This image shows inter-basin coupling as a cause of multi-year tropical Pacific climate predictability: Impact of Atlantic warming on global atmospheric Walker Circulation (arrows). Rising air over the Atlantic subsides over the equatorial Pacific, causing central Pacific sea surface cooling, which in turn reinforces the large-scale wind anomalies. Credit: Yoshimitsu Chikamoto

Tropical Pacific climate variations and their global weather impacts may be predicted much further in advance than previously thought, according to research by an international team of climate scientists from the USA, Australia, and Japan. The source of this predictability lies in the tight interactions between the ocean and the atmosphere and among the Atlantic, the Pacific and the Indian Oceans. Such long-term tropical climate forecasts are useful to the public and policy makers.

At present, computer simulations can predict the occurrence of an El Niño event at best three seasons in advance. Climate modeling centers worldwide generate and disseminate these forecasts on an operational basis. Scientists have assumed that the skill and reliability of such tropical climate forecasts drop rapidly for lead times longer than one year.

The new findings of predictable climate variations up to three years in advance are based on a series of hindcast computer modeling experiments, which included observed ocean temperature and salinity data. The results are presented in the April 21, 2015, online issue of Nature Communications.

“We found that, even three to four years after starting the prediction, the model was still tracking the observations well,” says Yoshimitsu Chikamoto at the University of Hawaii at Manoa International Pacific Research Center and lead author of the study. “This implies that central Pacific climate conditions can be predicted over several years ahead.”

“The mechanism is simple,” states co-author Shang-Ping Xie from the University of California San Diego. “Warmer water in the Atlantic heats up the atmosphere. Rising air and increased precipitation drive a large atmospheric circulation cell, which then sinks over the Central Pacific. The relatively dry air feeds surface winds back into the Atlantic and the Indian Ocean. These winds cool the Central Pacific, leading to conditions similar to a La Niña Modoki event. The central Pacific cooling then strengthens the global atmospheric circulation anomalies.”

“Our results present a paradigm shift,” explains co-author Axel Timmermann, climate scientist and professor at the University of Hawaii. “Whereas the Pacific was previously considered the main driver of tropical climate variability and the Atlantic and Indian Ocean its slaves, our results document a much more active role for the Atlantic Ocean in determining conditions in the other two ocean basins. The coupling between the oceans is established by a massive reorganization of the atmospheric circulation.”

The impacts of the findings are wide-ranging. “Central Pacific temperature changes have a remote effect on rainfall in California and Australia. Seeing the Atlantic as an important contributor to these rainfall shifts, which happen as far away as Australia, came to us as a great surprise. It highlights the fact that on multi-year timescales we have to view climate variability in a global perspective, rather than through a basin-wide lens,” says Jing-Jia Luo, co-author of the study and climate scientist at the Bureau of Meteorology in Australia.

“Our study fills the gap between the well-established seasonal predictions and internationally ongoing decadal forecasting efforts. We anticipate that the main results will soon be corroborated by other climate computer models,” concludes co-author Masahide Kimoto from the University of Tokyo, Japan.

Journal Reference:

  1. Yoshimitsu Chikamoto, Axel Timmermann, Jing-Jia Luo, Takashi Mochizuki, Masahide Kimoto, Masahiro Watanabe, Masayoshi Ishii, Shang-Ping Xie, Fei-Fei Jin. Skilful multi-year predictions of tropical trans-basin climate variability. Nature Communications, 2015; 6: 6869. DOI: 10.1038/ncomms7869

Software tool allows scientists to correct climate ‘misinformation’ from major media outlets (ClimateWire)

ClimateWire, April 13, 2015.

Manon Verchot, E&E reporter

Published: Monday, April 13, 2015

After years of misinformation about climate change and climate science in the media, more than two dozen climate scientists are developing a Web browser plugin to right the wrongs in climate reporting.

The plugin, called Climate Feedback and developed by Hypothes.is, a nonprofit software developer, allows researchers to annotate articles in major media publications and correct errors made by journalists.

“People’s views about climate science depend far too much on their politics and what their favorite politicians are saying,” said Aaron Huertas, science communications officer at the Union of Concerned Scientists. “Misinformation hurts our ability to make rational decisions. It’s up to journalists to tell the public what we really know, though it can be difficult to make time to do that, especially when covering breaking news.”

An analysis from the Union of Concerned Scientists found that levels of inaccuracy surrounding climate change vary dramatically depending on the news outlet. In 2013, 72 percent of climate-related coverage on Fox News contained misleading statements, compared to 30 percent on CNN and 8 percent on MSNBC.

Through Climate Feedback, researchers can comment on inaccurate statements and rate the credibility of articles. The group focuses on annotating articles from news outlets it considers influential — like The Wall Street Journal or The New York Times — rather than blogs.
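The annotations themselves live on the Hypothes.is platform, which exposes a public search API. Here is a hedged sketch of how one might list the annotations attached to a given article; the endpoint and response fields follow Hypothes.is's public API as I understand it, the article URL is hypothetical, and none of this is Climate Feedback's own code:

```python
import requests

# Fetch public annotations for one article URL (hypothetical example).
ARTICLE_URL = "http://www.wsj.com/articles/example-climate-story"

resp = requests.get(
    "https://api.hypothes.is/api/search",
    params={"uri": ARTICLE_URL, "limit": 20},
)
resp.raise_for_status()

for row in resp.json().get("rows", []):
    user = row.get("user", "unknown")
    excerpt = row.get("text", "").replace("\n", " ")[:80]
    print(f"{user}: {excerpt}")
```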

“When you read an article it’s not just about it being wrong or right — it’s much more complicated than that,” said Emmanuel Vincent, a climate scientist at the University of California, Merced’s Center for Climate Communication, who developed the idea behind Climate Feedback. “People still get confused about the basics of climate change.”

‘It’s crucial in a democracy’

According to Vincent, one of the things journalists struggle with most is articulating the effect of climate change on extreme weather events. Though hurricanes or other major storms cannot be directly attributed to climate change, scientists expect warmer ocean temperatures and higher levels of water vapor in the atmosphere to make storms more intense. Factors like sea-level rise are expected to make hurricanes more devastating as higher sea levels allow storm surges to pass over existing infrastructure.

“Trying to connect a weather event with climate change is not the best approach,” Vincent said.

Climate Feedback hopes to clarify issues like these. The group’s first task was annotating an article published in The Wall Street Journal in September 2014.

In the piece, the newspaper reported that sea-level rise experienced today is the same as sea-level rise experienced 70 years ago. But in the annotated version of the story, Vincent pointed to research from Columbia University that directly contradicted that idea.

“The rate of sea level rise has actually quadrupled since preindustrial times,” wrote Vincent in the margins.

Vincent hopes that tools like Climate Feedback can help journalists learn to better communicate climate research and can make members of the public confident that the information they are receiving is credible.

Researchers who want to contribute to Climate Feedback are required to have published at least one climate-related article that passed a peer review. Many say these tools are particularly important in the Internet era, when masses of information make it difficult for the public to wade through the vast quantities of articles and reports.

“There are big decisions that need to be made about climate change,” Vincent said. “It’s crucial in a democracy for people to know about these issues.”

The Mexican Region That Believes It Is Protected by ETs (BBC)

15 April 2015

BBC Mundo

Many residents of Tampico and Ciudad Madero believe the stretch of coast facing Miramar beach is the best place to spot ETs

Sitting on a sofa in a modest café in Ciudad Madero, a man invites me to meditate in order to see UFOs.

The television is showing Bob Marley singing I Shot the Sheriff and, behind the counter, a woman prepares a frappuccino.

The city lies in the violent state of Tamaulipas, in northeastern Mexico, and many believe extraterrestrials have spent decades protecting it from hurricanes.

That is because, according to the more devout locals, when the region's hurricanes push toward the coast where the city sits, they stop abruptly and mysteriously and change direction.

Some residents say they have seen the aliens; others claim there is an underwater base some 40 kilometers off the coast and that they have seen their craft, spheres, triangles and lights.

Aliens are a subject discussed openly in this region of Mexico (Thinkstock)

And everyone talks openly about it.

Civil engineer Fernando Alonso Gallardo, 68, a retiree of the state oil company Pemex and now a businessman, has a face tanned by the sun of the local beach, Miramar, a ten-kilometer strip of sand.

Through the windows of Gallardo's beachfront restaurant, El Mexicano, drifts a breeze off the Gulf of Mexico.

Gallardo tells his story to BBC Mundo, the BBC's Spanish-language service. His, like that of many in Ciudad Madero, involves sightings of unidentified flying objects.

Hurricanes in 1933 and 1955 destroyed the Alonso family's restaurant (BBC Mundo)

In 1933, when hurricanes did not yet have names, a Category 5 storm struck Tampico, where Gallardo was born, near Ciudad Madero. The hurricane destroyed his father's restaurant, but the family built another.

In 1955, Hurricane Hilda, which flooded three quarters of the city and left 20,000 people homeless, hit the region again.

"I suppose there were no extraterrestrials in those days; if there had been, there wouldn't have been so many disasters," Gallardo says.

Hurricanes also struck in 1947, 1951 and 1966. But soon afterward, the storms stopped reaching the region.

Researchers believe the real reason the hurricanes veer away is the presence of cold-water currents in the area. But in the neighboring cities of Tampico and Ciudad Madero, no one is unaware of the belief that something supernatural defends the region.

Sightings

Between the 19th century and the 1970s, when people saw luminous objects in the sky, they said they were witches.

In 1967, a monument to the Virgin of Carmen, patroness of the sea and of sailors, was erected at the spot fishermen pass when leaving the Pánuco River, which divides the states of Tamaulipas and Veracruz.

Many saw in it the explanation for the hurricanes' disappearance.

To this day it is traditional for sailors to make the sign of the cross before the statue and for captains to sound their ships' horns, says Marco Flores, official chronicler of the Tampico city government since 1995.

The Martian theory arrived a little later.

Many believe it is the ETs who protect the region from hurricanes (BBC Mundo)

According to Flores, it was brought by a man from Mexico City who came to Tampico for work around the 1970s and insisted that, more than protecting the city, the extraterrestrials who had made contact with him were guarding their underwater bases.

Alonso Gallardo agrees. "It is not an effort to protect the city; it is an effort to protect the city where they live, because they found a way to be there."

Gallardo says he saw his first UFO in 1983: a disc 60 meters across with yellowish lights. It happened at the end of the breakwater that separates the green water of the Gulf of Mexico from the dark water of the Pánuco River.

That spot, the believers say, is the best place to see the objects.

'A lack of intelligence'

The believers' meeting point used to be a café inside a Walmart, but the woman who served them never seemed comfortable with the topic of conversation. So the members of the Tampico UFO Scientific Investigation Association moved to the Bambino restaurant in Ciudad Madero.

There, each member waits their turn to recount their experiences.

José Luis Cárdenas takes photos of the sky in which strange lights appear (BBC Mundo)

At the head of the table, Eduardo Ortiz Anguiano, 83, talks about the book he published last year, Of UFOs, Ghosts and Other Extraordinary Events.

Over three years he collected more than 100 accounts and became convinced: "To doubt the existence of UFOs is to lack intelligence."

And many agree. Eva Martínez says the presence of extraterrestrials gives her peace.

José Luis Cárdenas has several photographs showing lights with strange shapes; lights that are not in the sky at the moment of the photo but that appear on the camera's display, he says.

"If the beings who visit us do not harm us, then they are protecting us; they are doing something for us. And that is how we have to see things," he said.

The last time a hurricane heading for the Tampico area veered away was in 2013.

That year, local authorities placed a bust of a Martian on Miramar beach (it was stolen shortly afterward) and declared that the Day of the Martian would be celebrated on the last Tuesday of October.

"The explanation we cannot give scientifically, we give magically. The people of this region have a magical way of thinking," says Flores, the Tampico chronicler.

'God likes Tampico'

On the café sofa in Ciudad Madero, Juan Carlos Ramón López Díaz, president of the UFO researchers' association, asks me to close my eyes and keep my mind calm.

He invites me to see a luminous object that I can enter, if I wish.

Behind the counter, someone turns on a blender. I open my eyes. Despite López Díaz's help, I saw nothing.

Welcome to Global Warming’s Terrifying New Era (Slate)

By Eric Holthaus

19 March 2015

Storm damage in Port Vila, Vanuatu. Photo by UNICEF via Getty Images

On Wednesday, the National Oceanic and Atmospheric Administration announced that Earth’s global temperature for February was among the hottest ever measured. So far, 2015 is tracking above record-warm 2014—which, when combined with the newly resurgent El Niño, means we’re on pace for another hottest year in history.

In addition to the just-completed warmest winter on record globally (despite the brutal cold and record snow in the eastern U.S.), new data on Thursday from the National Snow and Ice Data Center show that this year’s peak Arctic sea ice reached its lowest ever maximum extent, thanks to “an unusual configuration of the jet stream” that greatly warmed the Pacific Ocean near Alaska.

But here’s the most upsetting news. It’s been exactly 30 years since the last time the world was briefly cooler than its 20th-century average. Every single month since February 1985 has been hotter than the long-term average—that’s 360 consecutive months.

More than just being a round number, the 30-year streak has deeper significance. In climatology, a continuous 30-year stretch of data is traditionally what’s used to define what’s “normal” for a given location. In a very real way, we can now say that for our given location—the planet Earth—global warming is now “normal.” Forget debating—our climate has officially changed.

This 30-year streak should change the way we think and talk about this issue. We’ve entered a new era in which global warming is a defining characteristic and a fundamental driver of what it means to be an inhabitant of planet Earth. We should treat it that way. For those who care about the climate, that may mean de-emphasizing statistics and science and beginning to talk more confidently about the moral implications of continuing on our current path.

Since disasters disproportionately impact the poor, climate change is increasingly an important economic and social justice issue. The pope will visit the United States later this year as part of a broader campaign by the Vatican to directly influence the outcome of this year’s global climate negotiations in Paris—recent polling data show his message may be resonating, especially with political conservatives and nonscience types. Two-thirds of Americans now believe that world leaders are morally obligated to take steps to reduce carbon.

Scientists and journalists have debated the connection between extreme weather and global warming for years, but what’s happening now is different. Since weather impacts virtually every facet of our lives (at least in a small way), and since climate change is affecting weather at every point in the globe every day (at least in a small way), that makes it at the same time incredibly difficult to study and incredibly important. Formal attribution studies that attempt to scientifically tease out whether global warming “caused” individual events are shortsighted and miss the point. It’s time for a change in tack. The better question to ask is: How do we as a civilization collectively tackle the weather extremes we already face?

In the aftermath of the nearly unprecedented power and destructive force of Cyclone Pam’s landfall in the remote Pacific island nation of Vanuatu—where survivors were forced to drink saltwater—emerges perhaps the best recent example I’ve seen of a government acknowledging this changed climate in a scientifically sound way:

Cyclone Pam is a consequence of climate change since all weather is affected by the planet’s now considerably warmer climate. The spate of extreme storms over the past decade—of which Pam is the latest—is entirely consistent in science with the hottest ever decade on record.

The statement was from the government of the Philippines, the previous country to suffer a direct strike by a Category 5 cyclone—Haiyan in 2013. As chair of the Climate Vulnerable Forum negotiating bloc, the Philippines also called for a strengthening of ambition in the run-up to this year’s global climate agreement in Paris.

The cost of disasters of all types is rising around the globe as population and wealth increase and storms become more fierce. This week in Japan, 187 countries agreed on a comprehensive plan to reduce loss of life from disasters as well as their financial impact. However, the disaster deal is nonbinding and won’t provide support to the most vulnerable countries.

Combining weather statistics and photos of devastated tropical islands with discussions of political and economic winners and losers is increasingly necessary as climate change enters a new era. We’re no longer describing the problem. We’re telling the story of how humanity reacts to this new normal.

As the Guardian’s Alan Rusbridger, in an editorial kickoff of his newspaper’s newly heightened focus on climate, said, “the mainstream argument has moved on.” What’s coming next isn’t certain, but it’s likely to be much more visceral and real than steadily upward sloping lines on a graph.

A Major Surge in Atmospheric Warming Is Probably Coming in the Next Five Years (Motherboard)

Written by NAFEEZ AHMED

March 2, 2015 // 08:25 PM CET

Forget the so-called ‘pause’ in global warming—new research says we might be in for an era of deeply accelerated heating.

While the rate of atmospheric warming in recent years has, indeed, slowed due to various natural weather cycles—hence the skeptics’ droning on about “pauses”—global warming, as a whole, has not stopped. Far from it. It’s actually sped up, dramatically, as excess heat has absorbed into the oceans. We’ve only begun to realize the extent of this phenomenon in recent years, after scientists developed new technologies capable of measuring ocean temperatures with a depth and precision that was previously lacking.

In 2011, a paper in Geophysical Research Letters tallied up the total warming data from land, air, ice, and the oceans. In 2012, the lead author of that study, oceanographer John Church, updated his research. What Church found was shocking: in recent decades, climate change has been adding on average around 125 trillion Joules of heat energy to the oceans per second.

How to convey this extraordinary fact? His team came up with an analogy: it was roughly the same amount of energy that would be released by the detonation of two atomic bombs the size of the one dropped on Hiroshima. In other words, these scientists found that anthropogenic climate change is warming the oceans at a rate equivalent to around two Hiroshima bombs per second. But as new data came in, the situation looked worse: over the last 17 years, the rate of warming has doubled to about four bombs per second. In 2013, the rate of warming tripled to become equivalent to 12 Hiroshima bombs every second.

So not only is warming intensifying, it is also accelerating. By burning fossil fuels, humans are effectively detonating 378 million atomic bombs in the oceans each year—this, along with the ocean’s over-absorption of carbon dioxide, has fuelled ocean acidification, and now threatens the entire marine food chain as well as animals who feed on marine species. Like, er, many humans.
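The bomb analogy is straightforward arithmetic; taking the commonly cited ~6.3 × 10^13 joules for the Hiroshima yield and the heating rates quoted above:

```python
# Check the Hiroshima-bomb analogy. The ~63 TJ yield is the commonly
# cited figure; the heating rates are those quoted in the article.
HIROSHIMA_JOULES = 6.3e13
SECONDS_PER_YEAR = 365 * 24 * 3600

earlier_rate = 125e12  # J/s added to the oceans, Church's estimate
print("bombs per second:", round(earlier_rate / HIROSHIMA_JOULES, 1))  # ~2

bombs_per_second_2013 = 12
print(f"bombs per year: {bombs_per_second_2013 * SECONDS_PER_YEAR:,}")
# -> 378,432,000, i.e. the "378 million ... each year" quoted above
```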

According to a new paper from a crack team of climate scientists, a key reason the oceans have been absorbing all this heat so effectively in recent decades (thus masking the extent of global warming by allowing atmospheric average temperatures to heat more slowly) is the Pacific Decadal Oscillation (PDO), an El Nino-like weather pattern that can last anywhere between 15 and 30 years.

In its previous positive phase, which ran from around 1977 to 1998, the PDO meant the oceans would absorb less heat, thus operating as an accelerator on atmospheric temperatures. Since 1998, the PDO has been in a largely negative phase, during which the oceans absorb more heat from the atmosphere.

Such decadal ocean cycles have broken down recently, and become more sporadic. The last, mostly negative phase, was punctuated by a brief positive phase that lasted 3 years between 2002 and 2005. The authors of the new study, Penn State climatologist Michael Mann, University of Minnesota geologist Byron Steinman, and Penn State meteorologist Sonya Miller, point out that the PDO, as well as the Atlantic Multidecadal Oscillation (AMO), have thus played a major role in temporarily dampening atmospheric warming.

“In other words, the ‘slowdown’ is fleeting and will likely soon disappear.”

So what has happened? During this period, Mann and his team show, there has been increased “heat burial” in the Pacific ocean, that is, a greater absorption of all that heat equivalent to hundreds of millions of Hiroshimas. For some, this has created the false impression, solely from looking at global average surface air temperatures, of a ‘pause’ in warming. But as Mann said, the combination of the AMO and PDO “likely offset anthropogenic warming over the past decade.”

Therefore, the “pause” doesn’t really exist; it is an artifact of where we measure warming, since surface air temperatures miss the heat being buried in the ocean.

“The ‘false pause’ is explained in part by cooling in the Pacific ocean over the past one-to-two decades,” Mann told me, “but that is likely to reverse soon: in other words, the ‘slowdown’ is fleeting and will likely soon disappear.”

The disappearance of the ‘slowdown’ will, in tangible terms, mean that the oceans will absorb less atmospheric heat. While all the accumulated ocean heat “is certainly not going to pop back out,” NASA’s chief climate scientist Dr. Gavin Schmidt told me, it is likely to mean that less atmospheric heat will end up being absorbed. “Ocean cycles can modulate the uptake of anthropogenic heat, as some have speculated for the last decade or so, but… net flux is still going to be going into the ocean.”

According to Mann and his team, at some point, this will manifest as an acceleration in the rise of global average surface air temperatures. In their Science study, they observe: “Given the pattern of past historical variation, this trend will likely reverse with internal variability, instead adding to anthropogenic warming in the coming decades.”

So at some point in the near future, the PDO will switch from its current negative phase back to positive, reducing the capacity of the oceans to accumulate heat from the atmosphere. That positive phase of the PDO will therefore see a rapid rise in global surface air temperatures, as the oceans’ capacity to absorb all those Hiroshima bomb equivalents declines—and leaves it to accumulate in our skies. In other words, after years of slower-than-expected warming, we may suddenly feel the heat.

So when will that happen? No one knows for sure, but at the end of last year, signs emerged that the phase shift to a positive PDO could be happening right now.

In the five months before November 2014, measures of surface temperature differences in the Pacific shifted to positive, according to the National Oceanic and Atmospheric Administration. This was the longest such positive shift detected in about 12 years. Although it is too soon to say for sure whether this is indeed the beginning of the PDO’s switch to a new positive phase, the interpretation is consistent with current temperature variations, which during a positive PDO phase should be relatively warm in the tropical Pacific and relatively cool in regions north of about 20 degrees latitude.
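
What “longest such positive shift” means in practice is just a run-length count over the monthly index values. A minimal sketch, with invented numbers (NOAA publishes the real monthly PDO series):

    # Find the longest run of consecutive positive months in a PDO index
    # series. The values below are invented for illustration.
    def longest_positive_run(values):
        best = current = 0
        for v in values:
            current = current + 1 if v > 0 else 0
            best = max(best, current)
        return best

    pdo_index = [-0.3, 0.1, 0.5, 0.8, 1.2, 1.0, -0.2]
    print(longest_positive_run(pdo_index))   # 5 consecutive positive months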

In January 2015, further signs emerged that the PDO is right now in transition to a new warm phase. “Global warming is about to get a boost,” ventured meteorologist Eric Holthaus. Recent data, including California’s intensifying drought and sightings of tropical fish off the Alaskan coast, “are further evidence of unusual ocean warming,” suggesting that a transition to a new warm PDO phase may already be underway.

While it’s still not clear whether the PDO is really shifting into a new phase just yet, when it does, it won’t be good. Scientists from the UK Met Office’s Hadley Centre, led by Dr. Chris Roberts of the Oceans and Cryosphere Group, estimate in a new paper in Nature that there is an 85 percent chance the faux “pause” will end in the next five years, followed by a burst of warming driven by a decade or so of warm ocean oscillations.

Roberts and his team found that a “slowdown” period is usually (60 percent of the time) followed by rapid warming at twice the background rate for at least five years, and potentially longer. Mostly, this warming would be concentrated in the Arctic, a region already warming faster than the global average and widely recognized as a barometer of the health of the global climate, because Arctic changes dramatically alter trends elsewhere. Recent extreme weather events around the world have been attributed to the melting of Arctic ice and its impact on ocean circulation and jet streams.

What this means, if the UK Met Office is right, is that we probably have five years (likely less) before we witness a supercharged surge of rapid global warming that could last a decade, further destabilizing the climate system in deeply unpredictable ways.

Weather forecasting in Brazil’s Southeast is a headache for scientists (Folha de S.Paulo)

15/02/2015

Peculiarities of the regional climate make it hard to know what the Cantareira system’s level will be even in the short term

The area is subject to the influence of several complex factors, such as moisture from the Amazon and cold fronts from Antarctica

REINALDO JOSÉ LOPES

SPECIAL TO FOLHA

LUCAS VETTORAZZO

IN RIO

If the string of good and bad news about the rains that feed São Paulo’s reservoirs seems like one big muddle, don’t worry: climate forecasts for Brazil’s Southeast can confuse even specialists.

That is because Brazil’s most populous region occupies an area of the globe that receives every kind of complex influence, from moisture coming off the Amazon to cold fronts “blown in” from Antarctica.

The result: an above-normal level of uncertainty in a field that, by its nature, is already quite uncertain.

“This applies mainly to predicting climate, that is, medium- and long-term variations, but it is also true, though to a much smaller degree, of weather forecasts, on the scale of days,” says Tercio Ambrizzi, a climatologist at USP.

So it is not that the weather is more unstable in the area of the Cantareira system, the one hardest hit by the current crisis and now in slight recovery. What happens is that the region feeding the Cantareira can at times be more subject to random variations of a naturally complicated climate system.

CHAOS THEORY

“On scales longer than 15 days, it has been established for decades that the climate is chaotic,” says Gilvan Sampaio de Oliveira, a meteorologist at Inpe (Brazil’s National Institute for Space Research).

“In fact, that is where chaos theory came from,” he says, referring to the idea that, in certain complex phenomena, small changes at the start can lead to much larger, unpredictable changes at the end.

In tropical regions, which cover almost all of Brazil’s territory, this is even more true, because the heat injects more energy into the atmosphere, making changes in the weather happen faster and less predictably.
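
A toy illustration of the sensitivity Oliveira is describing, using the logistic map (a standard chaotic system, not a weather model): two starting points differing by one part in a billion end up in completely different places after a few dozen steps.

    # Sensitive dependence on initial conditions in the logistic map.
    def iterate(x, r=4.0, steps=40):
        for _ in range(steps):
            x = r * x * (1.0 - x)
        return x

    print(iterate(0.400000000))
    print(iterate(0.400000001))   # diverges completely from the first run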

Beyond the heat, though, the Southeast has the further disadvantage that its climate variations depend on non-oceanic factors.

“When a region’s climate depends on the ocean, it is much easier to predict, because oceanic variations happen much more slowly than atmospheric ones,” explains Oliveira. “That is the case of the semiarid Northeast, tied basically to conditions in the Pacific and the tropical Atlantic. In an El Niño year, with a warmer Pacific, the tendency is drought in the Northeast.”

The Southeast’s rains, especially those of summer, are linked mainly to the SACZ (South Atlantic Convergence Zone), formed by moisture from the Amazon, which spreads across a broad band crossing Central Brazil, and by Antarctic cold fronts.

“When this zone strengthens you can get constant rain for three, four, five days, and it is quite common for that to happen during Carnival, as indeed it should this year,” says Oliveira.

In 2014, however, and to a lesser degree this year as well, the SACZ did not act as it should, with an atmospheric blocking pattern preventing the summer rains from hitting the Southeast (and the Cantareira) in full. Constant, well-distributed rains returned only in recent weeks, because the SACZ seems to have “settled” back into place.

Even in that scenario, the summer rains do not cease entirely. With the season’s typical heat, there is a fast cycle of evaporation and rainfall, but it is a local pattern, which explains localized storms and flooding in Greater São Paulo without those downpours making any dent in the Cantareira.

There is yet another complication, one that may help explain the area’s reputation for unpredictability. Until recently there were no reliable rain gauges measuring rainfall in the Cantareira region, says José Marengo, a climatologist at Cemaden (Brazil’s National Center for Monitoring and Early Warning of Natural Disasters).

“The nearest rain gauges were those of Campos do Jordão. Historical records are lacking. We cannot fill in with the Campos do Jordão data because it is a different rainfall regime.”

Breach of protocol (Folha de S.Paulo)

19/02/2015

The Kyoto Protocol, the controversial 1997 treaty to contain climate change, has had scant success. The best evidence is that international negotiations on the subject are now taking a very different course.

Nothing guarantees, however, that a successful new path will open at the decisive Paris conference to be held in December.

This February marks ten years since the protocol came into force, after ratification by 189 countries. The agreement required 37 industrialized nations to cut their greenhouse gas emissions by 5% over the period from 2008 to 2012, taking 1990 as the baseline year.

The target was met with room to spare. The committed countries cut by 22.6% the pollution that traps solar radiation in the atmosphere, overheating it. To speak of failure thus seems petty.

The trouble is that global emissions rose 16.2% through 2012. The world’s climate stayed firmly on its warming course because the two biggest polluters, China and the United States, were not party to the treaty.

The former because it is a developing country (and what development: its economy grew at rates close to 10% a year). The latter because of domestic politics, since the election of Democratic president Barack Obama proved unable to reverse the congressional veto of any agreement that excluded China.

At the end of Kyoto’s first period, in 2012, a new and more ambitious goal was attempted in Doha (Qatar): cutting emissions by 18% by 2020. Only 23 nations have adhered to that target so far.

The failure of the route opened in 1997 and paved in 2005 was above all political. Achieving consensus among many countries on targets that can be monitored and enforced by an international body looks ever less viable, as does the transfer of funds and technology, anathema among the developed powers.

The alternative that emerged from the tortuous diplomatic negotiations over climate change, after the fiasco of the Copenhagen summit (2009), was to loosen commitments. They can now be voluntary, and each nation proposes when it plans to meet them and against which baseline year.

There is broad skepticism, though, about the chances that this non-binding strategy will produce the hoped-for result, namely keeping warming below 2°C this century. Beyond that limit, science points to a sharply higher risk of disasters such as long-lasting droughts and outsized floods.

All eyes now turn to the end of autumn in Paris.

*   *   *

Kyoto, at 10, is still crawling

16/02/2015

The protocol, in force since 2005, failed to cut world greenhouse gas emissions; worse, the new targets set in 2012 have drawn only 23 adhesions

MAURÍCIO TUFFANI

SPECIAL TO FOLHA

Ten years after coming into force, the Kyoto Protocol has a clear diagnosis: the agreement failed to reduce world greenhouse gas emissions, which grew 16.2% from 2005 to 2012.

The international pact was not entirely futile, however. It had some success in raising public awareness and in implementing environmental, technological, and economic development projects to prevent global warming from getting worse.

Concluded in 1997 in Kyoto, Japan, the protocol set targets for cutting greenhouse gas emissions. Only in 2005 did it gain the force to take effect, with ratification by Russia.

The protocol received 189 ratifications, among them Brazil’s, in 2002. But its new emissions-reduction targets for 2013 to 2020, set in Qatar in 2012, have so far drawn only 23 adhesions.

In a progress report, the secretariat of the United Nations Framework Convention on Climate Change (UNFCCC) highlighted that 37 countries, most of them from the European Union, exceeded their target of cutting emissions by 5% by 2012.

The agency, however, left aside the figures for the global rise in emissions and the emphatic warning issued in 2014 by its scientific arm, the IPCC (Intergovernmental Panel on Climate Change): there is no time left to reduce the concentration of greenhouse gases enough to keep the average rise in the Earth’s surface temperature through 2100 below 2°C.

That rise would bring as consequences more droughts, melting glaciers, and the flooding of coastal zones as the oceans rise.

To avoid that scenario, emissions would have to be cut by 80% by 2050.

LOSSES AND GAINS

“I am convinced that without the Kyoto Protocol we would not be as far along as we are today in the growing penetration of renewable energy,” said Christiana Figueres, executive secretary of the UNFCCC.

Figueres also highlighted some 7,800 projects supporting developing countries, involving benefits of up to US$ 13.5 billion, to reduce emissions from deforestation and to promote “carbon sequestration” from the atmosphere through the recovery and expansion of forests.

“If we look quantitatively at emissions, the protocol failed. But without it the European Union would not have achieved major advances in its reductions,” says Carlos Nobre, who has just taken over as director of Cemaden (Brazil’s National Center for Monitoring and Early Warning of Natural Disasters).

Nobre stresses that Germany showed it is possible to cut greenhouse gases without shrinking GDP. “I view this pedagogical effect with optimism,” he said.

For USP physicist and IPCC member Paulo Artaxo, even though the treaty increased the uptake of new technologies and awareness of what he calls “the most serious problem humanity has ever faced,” there was, besides the rising concentration of carbon in the atmosphere, an accumulation of CO2 in the oceans, which can cause imbalances for marine life.

OUTLOOK

According to Carlos Rittl, executive secretary of the Observatório do Clima, the next climate conference, in December in Paris, may see progress thanks to the recent agreement between the US (which signed but never ratified the Kyoto Protocol) and China.

For him, one of the great challenges for governments, Brazil’s included, is economic and energy planning. He says it is still carried out without taking climate change into account, and the country’s current energy and water crisis is proof of that.

Fabio Feldmann, executive secretary of the São Paulo Forum on Climate Change and Biodiversity and a former state environment secretary of São Paulo, adds that the drop in deforestation in Brazil created a “false impression” that the country can keep emissions in other sectors at their current levels.

Indeed, while Brazil’s emissions from deforestation fell 64% from 2005 to 2013, those from agriculture and energy consumption grew 6% and 42%, respectively, according to the Observatório do Clima.

Climate war (Folha de S.Paulo)

Requests to pry open scientists’ correspondence grow as the Paris Climate Summit approaches, sharpening the battle over global warming in the US

RAFAEL GARCIA

19/02/2015

SPECIAL CORRESPONDENT IN SAN JOSE (USA)

Animosity between climatologists and groups that dispute the attribution of global warming to CO2 emissions has been growing, and a new war for control of information is being waged behind the scenes, mainly in the US.

The methods used in this fight, however, differ from the one employed on the eve of the Copenhagen Climate Summit in 2009, when several scientists had e-mails stolen and leaked onto the internet.

Now, climate skeptics use formal requests, based on freedom-of-information laws, to try to break the confidentiality of researchers’ correspondence.

“We will see a similar escalation as the Paris Climate Summit approaches at the end of 2015,” said climatologist Michael Mann, of Pennsylvania State University, in a talk at the meeting of the AAAS (American Association for the Advancement of Science) in San Jose.

The Paris meeting is expected to produce a new international agreement to fight global warming, replacing the Kyoto Protocol.

“There will be an effort to confuse the public and policymakers,” Mann said.

The petitions seeking access to scientists’ e-mails and notes generally allege suspicion of fraud and rely on freedom-of-information laws that guarantee access to documents produced by government employees.

According to a new report by the NGO Union of Concerned Scientists, this kind of approach to climatologists has been growing since 2010, when Ken Cuccinelli, then Virginia’s attorney general, subpoenaed the University of Virginia to release e-mails and notes by Mann, who had worked at the institution.

The case dragged on for four years and, even with a ruling in the scientist’s favor, long hours were consumed in disputes with the university itself, which threatened to release the data for fear of being punished.

Mann was the only one to wage a public fight. But according to the AGU (American Geophysical Union), demands of this kind have targeted scientists at institutions such as NASA, NOAA (the US oceanic and atmospheric agency), and the Department of Energy. Some give up on fighting the legal battle.

Steven Dyer, of Virginia Commonwealth University, reckoned that spending more than 100 hours compiling messages to answer petitions would be the less costly option, and interrupted his sabbatical to do so.

The entity behind the petition, the conservative think tank American Tradition Institute, then began demanding his “logbooks.” This and other such entities receive funding from the oil industry.

“They think we keep a book in which graduate students record what they are doing,” says Michael Halpern, author of the Union of Concerned Scientists report.

Since 2011, the AGU’s annual meeting has had a legal center at the disposal of climate scientists, advising them on how to act in such cases.

“This past year, I had a lot of work,” says attorney Lauren Kurtz. She now directs the Climate Science Legal Defense Fund, which raises money to assist scientists under harassment.

The West without Water: What Can Past Droughts Tell Us About Tomorrow? (Origins)

vol. 8, issue 6 – march 2015

by B. LYNN INGRAM

Editor’s Note:
Almost as soon as European settlers arrived in California they began advertising the place as the American Garden of Eden. And just as quickly people realized it was a garden with a very precarious water supply. Currently, California is in the middle of a years-long drought and the water crisis is threatening the region’s vital agricultural economy, not to mention the quality of life of its people, plants, and animals. This month B. Lynn Ingram, Professor of Geography and Earth & Planetary Science, examines how a deep historical account of California’s water patterns can help us plan for the future.


The state of California is beginning its fourth year of a serious drought, with no end in sight.

The majority of water in the western United States is delivered by winter storms from the Pacific, and over the past year, those storms were largely blocked by an enormous ridge of high pressure. A relatively wet December has given way to the driest January on record, and currently over 90 percent of California is in severe to exceptional drought.

The southwestern states are also experiencing moderate to severe drought, and this comes on the heels of a very dry decade. This long drought has crept up on the region, partly because droughts encroach slowly and they lack the visual and visceral effects of other, more immediate natural disasters such as earthquakes, floods, or tsunamis.

Meteorologists define drought as an abnormally long period of insufficient rainfall adversely affecting growing or living conditions. But this bland definition belies the devastation wrought by these natural disasters. Drought can lead to failed crops, desiccated landscapes, wildfires, dehydrated livestock, and in severe cases, water wars, famine, and mass migration.

Although the situation in the West has not yet reached such epic proportions, the fear is that if it continues much longer, it could.

Lake Powell, in 2009, showing a white calcium carbonate “bathtub ring” exposed after a decade of drought lowered the level of the reservoir to 60 percent of its capacity. (Photo courtesy of U.S. Bureau of Reclamation.)

In California, reservoirs are currently at only 38 percent of capacity, and the snowpack is only 25 percent of normal for late January. Elsewhere in the Southwest, Lake Powell, the largest reservoir on the Colorado River, is at 44 percent of capacity.

The amount of water transported through irrigation systems to California’s Central Valley—the most productive agricultural region in the world—has been reduced to only 20 percent of customary quantities, forcing farmers to deepen groundwater wells and drill new ones.

Over the past year, 410,000 acres have been fallowed in this vast agricultural region that provides 30 percent of all the produce grown in the United States and virtually all of the world’s almonds, walnuts, and pistachios. As California dries up, food prices might well rise across the nation.

The question on everyone’s mind is when will this dry period finally come to an end and rainfall return to normal—and just what is normal for the U.S. Southwest when it comes to rain?

And with a growing and more urban population and an ever-changing climate, will we ever be free from the threat of long dry periods, with their disruptive effects on food production and the plants and animals that rely on water to survive?

A glance into the history of the Southwest reminds us that the climate and rainfall patterns have varied tremendously over time, with stretches of drought many decades longer than the one we are experiencing now.

Long dry stretches during the Medieval centuries (especially between 900 and 1350 CE) had dramatic effects on the native peoples of the Southwest (the ancestral Pueblo, Hohokam, and Sinagua), including civilizational collapse, violence, malnutrition, and forced social dislocation.

These earlier Americans are a warning to us.

The past 150 years, which we have used as our baseline for assumptions about rainfall patterns, water availability for agriculture, water laws, and infrastructure planning, may in fact be an unusually wet period.

Let’s look at the past few hundred years first and then explore the region’s climate in geological time.

Recent Droughts and the Arid Regions of the United States

John Wesley Powell stands as one of the most extraordinary scientists and explorers in America in the second half of the 19th century.

In 1869 he became the first white man to lead an expedition down the Colorado River and through the Grand Canyon, a feat all the more remarkable considering Powell had lost most of his right arm during the Civil War.

Ten years later, Powell published Report on the Lands of the Arid Regions of the United States, a careful assessment of the region’s capacity to be developed.

In it, Powell argued that very little of the West could sustain agriculture. In fact, his calculations suggested that even if all the water in western streams were harnessed, only a tiny fraction of the land could be irrigated.

Further, Powell believed that growth and development ought to be carefully planned and managed, and that boundaries drawn for new western states ought to follow watersheds to avoid inter-state fighting over precious water resources.

When Powell presented his findings to Congress, politicians howled. Powell found himself denounced by pro-development forces, including railroads and agricultural interests.

Prescient as Powell’s study has proved to be, it was almost entirely ignored at the time.

Instead, those development boosters responded to Powell’s data about the aridity of the West with a novel climatological theory: “Rain follows the plow.” They insisted that agriculture could cause the rains to fall, so that, as if by magic, the more acres brought under cultivation, the more rain farmers would enjoy.

The years surrounding the turn of the 20th century turned out to be unusually wet across much of the region. Hopeful pioneers continued to flock to the West, despite the visible signs of aridity.

They still do. The past century and a half in California and the West has been a period of steady population growth. And today the U.S. Southwest is the fastest-growing region in the United States (which itself is the world’s fourth-fastest-growing nation).

The Dirty Thirties and Beyond

The relatively wet period of the late nineteenth and early twentieth centuries gave way to drought in the late 1920s with the start of the Dust Bowl—now considered to be the United States’ worst climate tragedy.

The years between 1928 and 1939 were among the driest of the 20th century in the American West. This drought had particularly severe effects on California’s developing agricultural industry, effects mitigated only by extensive groundwater pumping that eventually caused the ground surface in California’s Central Valley to drop by several feet.

Donner Lake, Sierra Nevada Range, California (Photo taken by B. Lynn Ingram).

In the 20th century, the single driest year (rivaling the 2013-2014 water year) came with the drought of 1976-1977, which extended across the entire state of California and into the Northwest, the Midwest, and the Canadian Prairie region north of Montana.

In California, precipitation levels dropped to less than a quarter of average. Reservoirs fell to one-third of their normal levels, and 7.5 million trees in the Sierra Nevada, weakened by drought, succumbed to insect-related diseases, fueling massive wildfires. Snowfall was extremely sparse, forcing ski areas to close.

The following decade, another six-year drought occurred from 1987 to 1992, and while no single year was as severe as 1976-1977, the cumulative effects were ultimately more devastating. Annual precipitation reached only 50 percent of the 20th-century average, with far-ranging impacts.

In the Sierra Nevada, water-stressed trees suffered widespread mortality from pine bark beetle infestations. Reduced stream flow caused major declines in fish populations, affecting commercial and recreational fisheries by lowering populations of Chinook salmon and striped bass.

By the fourth year of the drought, reservoir storage statewide was down 60 percent, causing a decline in hydroelectric power generation and the imposition of water restrictions including a decrease in agricultural water delivery by 75 percent.

Farmers relied more on groundwater, with private well owners deepening existing wells or drilling new ones. In the San Joaquin Valley, 11 million acre-feet more groundwater was extracted than could be replenished naturally, further lowering already low groundwater levels.

Measuring Droughts over Geological Time

As bad and worrisome as these more recent historical droughts in California and the West were, they pale in comparison to events uncovered in the geological record.

In recent years, earth scientists have been discovering that the climate and weather in the West over the past 100 to 150 years represents only a narrow part of the full range of climate in the region.

By peering deeper into Earth’s history—the past centuries and millennia—the frequency and magnitude of extreme climate events like drought can be better understood.

The evidence comes in various forms, such as mud from the bottom of lakes and ponds, microscopic organisms living in the oceans, bubbles frozen in glaciers, pencil-thin wood cores drilled from trees, and salts precipitating in dried-up lake bottoms.

A cut section of a Giant Sequoia trunk from Tuolumne Grove, Yosemite National Park, California, showing AD dates of fires (photo courtesy of Thomas Swetnam, Laboratory of Tree-Ring Research, University of Arizona).

One of the earliest records of past climate change comes from the rings of the long-lived Douglas fir. Trees are particularly effective recorders of climate because they respond every year to conditions of temperature and precipitation, responses recorded in the growth rings of their trunks.

In a landmark study in the early 1940s, a 600-year record of Colorado River flow reconstructed from Douglas firs revealed several sustained periods of low water flow, and these periods recurred with some regularity.

The reconstruction showed a particularly severe drought in the late 1500s, a drought lasting over a decade that has since shown up in multiple records from throughout the West.

These records also reveal that the driest single year over the past millennium (even drier than the parched 1976-1977 drought) occurred in 1580 CE. Trees across the West either had a narrow ring, or even a missing ring, that year.

Looking at an even broader picture, evidence from the past 10 millennia—a relatively warm era since the last Ice Age, which we call the Holocene—informs us that the severity of past extreme events (including droughts and floods) far exceeds those experienced over the past century and a half.

One of the longest dry periods for California and the West occurred during what is known as the mid-Holocene climatic optimum, a time when much of the earth experienced warmer than average conditions from about 4,500 to 7,500 years ago.

In the American West, there are numerous clues showing that this time period was drier than average for upwards of 1,400 years. These climate extremes caused significant human dislocations and forced native populations to migrate from the desert interiors of the West to the coastal regions.

The Tools for Uncovering Climate History

One of the most vivid clues for understanding the patterns of past drought in the West was revealed in Lake Tahoe toward the end of the Great Dust Bowl of the mid-1930s. At that time, Tahoe’s water level dropped fourteen inches, exposing a mysterious clustering of tree stumps sticking up from the water’s surface along the lake’s southern shore.

These trees attracted the attention of Samuel Harding, an engineering Professor from the University of California, Berkeley. Harding discovered that the trees were large, with trunks as wide as three feet in diameter, and appeared to be firmly rooted in the lake bottom.

Harding reasoned that the trees had grown in this location for a long time to attain such sizes, and since they were now submerged in over twelve feet of water, he surmised that at some time in the past the lake level had been much lower.

Frances Malamud-Roam, B. Lynn Ingram, and Christina Brady coring a small oxbow lake in the Sacramento Valley, California. (Photo taken by Anders Noren, University of Minnesota, LaCore curator.)

After collecting cores through their trunks, he counted up to 150 rings, concluding that it was a dry spell of over a century that caused the lake level to drop, allowing the trees to grow along the former shoreline.

Harding had to wait two decades before he could date this drought, after the invention of radiocarbon dating in the 1950s. Radiocarbon measurements of the outermost rings of the tree stumps showed that these trees died approximately 4,800 years ago.
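
The conversion from a carbon-14 measurement to an age like “approximately 4,800 years” follows from exponential decay: the conventional radiocarbon age is t = -8033 ln(F), where F is the fraction of carbon-14 remaining and 8,033 years is the Libby mean life. A sketch, with an illustrative fraction rather than the actual measurement from Harding’s stumps (and leaving aside the further calibration to calendar years):

    # Conventional radiocarbon age from the fraction of C-14 remaining.
    import math

    def radiocarbon_age(fraction_remaining):
        return -8033 * math.log(fraction_remaining)

    print(round(radiocarbon_age(0.55)))   # ~4802 years before present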

Decades later, more evidence emerged from Lake Tahoe during another of California’s droughts in the late 1980s, when the lake’s surface dropped again, exposing even more tree stumps.

This time, it was an archaeologist, Susan Lindstrom, who noticed the tops of trees sticking out of the water along Tahoe’s southern shore. Donning scuba gear, Lindstrom was able to find fifteen submerged tree stumps that had escaped Harding’s attention, some measuring up to three and a half feet in diameter.

The radiocarbon dates from this much larger population of trees refined and extended the boundaries of the mid-Holocene drought, moving the beginning to as early as 6,290 years ago, and the ending to 4,840 years ago.

These stumps, located deeper in the lake, showed that the lake level had dropped by even more than Harding originally thought – by more than 20 feet. Lindstrom and other researchers have since located tree stumps in more places around the shores of Lake Tahoe and in other Sierran lakes.

Sediment core taken by Frances Malamud-Roam and B. Lynn Ingram from beneath San Francisco Bay, California. (Photo taken by B. Lynn Ingram.)

Geologists have also discovered more evidence from sediment cores taken from beneath lakes revealing the wide extent of this drought—across California and the Great Basin.

The archaeological records show that native populations migrated from the inland desert regions to the California coast at this time, likely in search of water and other resources during this prolonged drought.

Another dry millennium began about 3,000 years after the mid-Holocene drought ended. Evidence for this prolonged drought was found throughout California and the West.

One study, conducted in my laboratory at UC Berkeley, examined sediments accumulating beneath the San Francisco Bay estuary. These sediments contain information about precipitation over the entire drainage basin of the Sacramento and San Joaquin Rivers—an area that covers 40 percent of California.

Frances Malamud-Roam and Anders Noren coring marsh sediments adjacent to San Francisco Bay (Photo taken by B. Lynn Ingram)

Rivers draining the Sierra Nevada Range and Central Valley flow through San Francisco Bay and out the Golden Gate to the Pacific Ocean. In the Bay, fresh river water meets and mixes with the incoming ocean water, producing a range of salinity: fresh at the Delta, saline in the Central bay near the Golden Gate, and brackish in between.

Organisms growing in the Bay record the salinity in their shells, which then sink to the bottom and are preserved in the sediments. We took sediment cores from beneath the Bay and analyzed the chemistry of the fossil shells, allowing us to reconstruct past salinity, and therefore past river flow.
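
The proxy logic here is essentially a calibration-and-inversion exercise: measure shell chemistry in modern shells across the Bay’s salinity gradient, fit the relationship, then invert it for fossil shells. A toy sketch, with invented coefficients and data (the published work rests on real calibrations):

    # Toy salinity reconstruction: calibrate shell chemistry against modern
    # salinity, then invert for a fossil shell. All numbers are invented.
    import numpy as np

    salinity = np.array([5.0, 12.0, 20.0, 28.0])    # modern samples
    shell_chem = np.array([-6.0, -4.2, -2.1, 0.1])  # e.g. oxygen-isotope values

    slope, intercept = np.polyfit(salinity, shell_chem, 1)

    fossil_value = -3.5
    print((fossil_value - intercept) / slope)   # inferred past salinity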

These studies showed that droughts lasting over a decade occurred regularly over the past two millennia, at intervals of 50 to 90 years. The cores also revealed a period of high salinity that began about 1,700 years ago and ended about 700 years ago, suggesting another prolonged drought.

We conducted a related study with Professor Roger Byrne in the Geography Department at UC Berkeley, coring the tidal marshlands surrounding the bay to assess the impact of this drought on this ecosystem.

These marshes have grown up around the edges of San Francisco Bay for the past 5,000 years or so, forming peat. The marsh peats contain fossil plants and chemical evidence for past periods of wetter and drier conditions in the watershed.

A drought in the watershed, if prolonged and severe, can cause higher salinity downstream in the estuary as the inflow of fresh water drops. In response, salt-tolerant species in the marshes expand further inland toward the Delta and the fresh water species retreat. Conversely, unusually wet winters generate fresher conditions in the estuary, leading to an expansion of freshwater-adapted species.

We analyzed the pollen and plant remains, carbon chemistry of the peats, and diatoms—the microscopic phytoplankton that grow in the marshes and produce tiny silica shells.

All of this evidence showed that the average freshwater inflow to San Francisco Bay was significantly lower than today’s levels for a thousand years, between 1,750 and 750 years ago.

The peak of this low-inflow interval, with freshwater flows 40 percent below average levels, occurred approximately 900 to 1,200 years ago, during a time when global temperatures were high, known as the Medieval Warm Period.

Mono Lake, showing calcium carbonate “tufa tower” formations that originally formed beneath the lake but are now exposed after the water level dropped. The eastern flank of the Sierra Nevada range is shown in the background. (Photo by D. J. DePaolo.)

Evidence for this drought was also discovered in an ancient lake situated east of the Sierra Nevada. Geography Professor Scott Stine analyzed the sedimentary sequences in Mono Lake, delineating patterns of alternately higher and lower lake levels for the past 4,000 years.

Mono Lake experienced an extended low stand that began about 1,600 years ago, dropping to an even lower level 700 to 1,200 years ago. During the 1980s drought, Stine also discovered large tree stumps submerged in Mono Lake.

Much like the tree stumps discovered in Lake Tahoe, these submerged trees indicated that at one time the lake was so small that its shoreline was several tens of feet lower than the present shoreline, when the trees now underwater could grow on dry ground. Stine went on to discover similar submerged tree stumps in lakes, marshes, and rivers throughout the central and southern Sierra Nevada Range.

By counting their growth rings, Stine determined that they had lived up to 160 years. Based on the amount the lake level dropped, he calculated that the average annual river flows in the region were only 40 to 60 percent of what they were in the late 20th century.
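
The step from a lake-level drop to a river-flow estimate rests on a closed-basin water balance: Mono Lake has no outlet, so at steady state inflow must equal evaporation times the lake’s surface area, and a shoreline far below the modern one implies a much smaller lake and proportionally smaller inflow. A toy version with invented figures (Stine’s calculation used Mono Lake’s actual geometry):

    # Closed-basin balance: steady-state inflow ~= evaporation rate * area.
    evaporation_m_per_yr = 1.0        # assumed net evaporation
    area_modern_km2 = 180.0           # invented modern lake area
    area_lowstand_km2 = 90.0          # invented area at the ancient shoreline

    ratio = area_lowstand_km2 / area_modern_km2
    print(f"inflow at low stand: {100 * ratio:.0f}% of modern")   # 50%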

Radiocarbon dates of the outer growth layers of these tree stumps revealed that these trees clustered around two distinct periods, now known as the “Medieval Megadroughts”: CE 900 to 1100 and CE 1200 to 1350.

An ancient tree stump submerged in the West Walker River, eastern Sierra Nevada. (Photo courtesy of D. J. DePaolo.)

Across North America, tree-ring studies indicate that climate conditions over the past two millennia became steadily more variable (shifting between drier and wetter periods), with especially severe droughts between CE 900 and 1400.

These records show that over half the American West suffered severe drought between CE 1021 and CE 1051, and from CE 1130-1170, CE 1240-1265, and CE 1360-1382.

The warm and dry conditions of the Medieval period spawned larger and more frequent wildfires, as recorded in the trunks of giant sequoias—the massive redwoods growing in about 75 distinct groves along the mid-elevations of the western Sierra Nevada. These spectacular trees can live up to 3,200 years or more, and have exceeded 250 feet in height and 35 feet in diameter.

Thomas Swetnam, the current director of the Laboratory of Tree-Ring Research at the University of Arizona, discovered that the trees carry scars on their annual growth rings that indicate past fires in the region.

Swetnam sampled giant sequoias from five groves between Yosemite National Park and Sequoia National Park, far enough apart that individual fires could not have spread from one grove to the next. He dated the trees using ring-width patterns, and recorded the fire scars contained within annual rings.

His analysis reveals that during the Medieval period, from 1,200 to 700 years ago, an average of thirty-six fires burned every century.

During the centuries preceding the Medieval period (from about 1,500 to 1,200 years ago) and immediately following it (from about 700 years ago to the current century), the fire frequency was substantially lower, with an average of 21 fires per century.
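
Computationally, these fire-frequency figures are just fire-scar dates bucketed by period. A minimal sketch with invented dates (Swetnam’s came from crossdated sequoia rings):

    # Average fires per century from a list of fire-scar years (invented).
    fire_years = [850, 862, 880, 905, 910, 955, 998, 1003, 1040, 1077]

    def fires_per_century(years, start, end):
        count = sum(start <= y < end for y in years)
        return 100.0 * count / (end - start)

    print(fires_per_century(fire_years, 800, 1100))   # ~3.3 in this toy series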

The Human Costs of Droughts Then and Now

The archaeological record suggests that the extended periods of drought in the Medieval era caused severe hardship for both coastal and inland peoples—particularly the ancestral Pueblo communities—as dwindling resources increased disease, malnutrition, and warfare. Long-inhabited sites were abandoned as desperate populations wandered in search of new water sources.

Ancient pueblo cliff dwelling at Mesa Verde, southwestern Colorado. (Photo taken by B. Lynn Ingram)

Much of what archaeologists know about the ancestral Pueblo comes from pueblo and cliff dwellings from the four corners region, including Chaco Canyon in northwestern New Mexico, Mesa Verde in southwestern Colorado, and Canyon de Chelly in northeastern Arizona.

Chaco Canyon in New Mexico was the site of one of the most extensive of the ancestral Pueblo settlements. At its peak, during the 11th and early 12th centuries CE, Chaco Canyon had great pueblos the size of apartment blocks housing hundreds of residents in large, high-ceilinged rooms.

These settlements were supported by agriculture, allowing people to settle in one place year-round. Most of the farming depended on annual rains, supplemented by water from nearby streams and groundwater.

But over time, the climate became increasingly arid and unpredictable. The ancestral Pueblo farmers were forced to build an extensive system of diversion dams and canals, directing rainwater from the mesa tops to fields on the canyon floor, allowing them to expand the area of arable land.

The population in the four corners region swelled throughout the 11th and 12th centuries CE—but then collapsed.

Another ancient society, the Hohokam, lived in central Arizona near the confluence of Arizona’s only three rivers, the Gila, Verde, and Salt. The Hohokam civilization thrived in central Arizona for a thousand years, building an extensive network of integrated canal systems, capable of transporting large volumes of water long distances.

At their peak, an estimated 40,000 Hohokam lived in Arizona, but they suddenly vanished in the mid-15th century.

Montezuma’s Castle, a cliff dwelling occupied by the Sinagua, located just north of Camp Verde in central Arizona. (Photo by B. Lynn Ingram.)

In northern Arizona, between Phoenix and Flagstaff, the Sinagua culture also thrived during this period. As the climate turned drier, they built cliff dwellings in central Arizona, suggesting that resources became scarce, forcing them to build fortified dwellings with hidden food storage areas. The Sinagua also disappeared about the same time as the Hohokam.

All of these societies were flourishing prior to a rather abrupt collapse. The archaeological record of the last decades of the ancestral Pueblo in Chaco Canyon abounds with signs of suffering.

Skeletal remains show signs of malnutrition, starvation and disease; life spans declined and infant mortality rates increased. Evidence of violence, possibly warfare, was found in mass graves containing bones penetrated with arrowheads and teeth marks, and skulls bearing the scars of scalping.

Piles of belongings were found, apparently left behind as the people abandoned their settlements and fled, some to live in fortified hideouts carved in the cliff faces, protecting their hoarded food from enemies.

The unusually dry climate of the Medieval period also appeared to have tested the endurance and coping strategies of even the well-adapted native populations in California.

The skeletal remains show that life in the interior of California was particularly difficult, as the drought severely reduced sources of food (nuts, plants, deer, and other game). Settlements along rivers were abandoned, and trade between inland and coastal groups broke down. As water supplies dried up, conflicts – even battles – between groups arose over territory and food and water resources.

The Watery Lessons of the Past

The “Medieval Drought” serves as a model for what can happen in the West. It also provides an important impetus for water sustainability planning. And the hardships suffered by the first human inhabitants in the West provide important lessons.

For instance, during extended periods of abundant moisture, some societies experienced rapid population growth, leaving them vulnerable to collapse when the climate inevitably turned dry again.

Modern societies in the West have followed a similar path over the past century—after a century of fairly abundant moisture, the population in this region has exploded (and become more urbanized).

Modern engineering has allowed the exploitation of all available water sources for human use, and western water policy has favored water development for power, cities, and farms over sustainability of the environment and ecosystems.

These policies have allowed populations to grow to the limit that this region can support, leaving us vulnerable during extended drier conditions.

The longest droughts the West experienced over the past century, lasting six years, are meager by comparison, despite the extreme hardship they brought to the region.

In fact, in the context of the longer-term climate history, the 20th century actually stands out as one of the wettest of the past 1,300 years, yet the droughts of the late 1920s and 1930s, of 1976-1977, and of the late 1980s caused immense hardship for our society, based as it is upon heavy water usage.

In addition, future changes in the global climate will interact with the natural cycles of drought in California and the West in ways that are difficult to predict. Climate models predict that warming will likely make the extreme events, particularly floods and droughts, even larger and more frequent.

Some of these impacts have already begun. Over the past two decades, warming and an earlier start of the spring season have caused forest fires to become more frequent and intense.

A warmer climate will also bring less precipitation that falls as snow. The American West depends on snow-bearing winter storms for a natural water reservoir. This snow begins melting in the late spring, and continues into the summer, filling streams, lakes, and reservoirs that sustain natural ecosystems throughout the dry summer months.

The snow pack supports cities and irrigated agriculture, providing up to 80 percent of the year’s water supply across the West. As the region warms, the snow that does fall will melt faster and earlier in the spring, rather than melting during the late spring and summer, when it is so critically needed.

The message of past climates is that the range of “normal” climate is enormous—and we have experienced only a relatively benign portion of it in recent history. The region’s climate over the past decade has been dry when compared to the 20th century average, suggesting a return to a drier period.

This past year was also the warmest on record in the American West, and the ten hottest years on record occurred since 1997. The position of inhabitants of the West is precarious now and growing more so.

As we continue with an unsustainable pattern of water use, we become more vulnerable each year to a future we cannot control. It is time for policy makers in the West to begin taking action toward preparing for drier conditions and decreased water availability.


Read more from Origins on Water and the Environment: The World Water Crisis; The River Jordan; Who Owns the Nile?; The Changing Arctic; Over-Fishing in American Waters; Climate Change and Human Population; and the Global Food Crisis.


Suggested Reading

Benson, L., Kashgarian, M., Rye, R., Lund, S., Paillet, F., Smoot, J., Kester, C., Mensing, S., Meko, D. and Lindstrom, S., 2002. “Holocene Multidecadal and Multi-centennial Droughts Affecting Northern California and Nevada.” Quaternary Science Reviews 21, 659-682.

Bradley, R.S., Briffa, K.R., Cole, J., Hughes, M.K., and Osborn, T.J., 2003. “The climate of the last millennium.” In: Alverson, K, Bradley, R.S., and Pedersen, T.F. (Eds.), Paleoclimate, Global Change and the Future, Springer Verlag, Berlin, pp. 105-49.

Brunelle, A. and Anderson, R.S., 2003. “Sedimentary charcoal as an indicator of late-Holocene drought in the Sierra Nevada, California, and its relevance to the future. “ The Holocene 13(1), 21-28.

Cayan, D. R., S. A. Kammerdiener, M. D. Dettinger, J. M. Caprio, and D. H. Peterson, 2001. “Changes in the onset of spring in the Western United States.” Bull. Am. Met. Soc., 82, 399-415.

Fagan, B., 2003. Before California: an Archaeologist Looks at Our Earliest Inhabitants. Rowman and Littlefield Publishers, Inc, Lanham, MD. 400 p.

Gleick, P.H. and E.L. Chalecki. 1999. “The impacts of climatic changes for water resources of the Colorado and Sacramento-San Joaquin river basins.” Journal of the American Water Resources Association, Vol. 35, No. 6, pp.

Hughes, M.K. and Brown, P.M., 1992. “Drought frequency in central California since 101 B.C. recorded in giant sequoia tree rings.” Climate Dynamics 6, 161-197.

Ingram, B. Lynn and Malamud-Roam, F. (2013) The West without Water: What past floods, droughts, and other climatic clues tell us about tomorrow. UC Press, 256 pages.

Ingram, B. L., Conrad, M.E., and Ingle, J.C., 1996. “A 2000-yr record of Sacramento-San Joaquin River inflow to San Francisco Bay estuary, California.” Geology 24, 331-334.

Lightfoot, K., 1997. “Cultural construction of coastal landscapes: A middle Holocene perspective from San Francisco Bay.” In: Erlandson, J. and Glassow, M. (eds), Archaeology of the California Coast during the Middle Holocene, 129-141. Series, Perspectives in California Archaeology 4, Institute of Archaeology, Univ. of California.

Malamud-Roam, F. and B.L. Ingram. 2004. “Late Holocene d13C and pollen records of paleosalinity from tidal marshes in the San Francisco estuary.” Quaternary Research 62, 134-145.

Stahle, D. W., Cook, E. R., Cleaveland, M. K., Therrell, M. D., Meko, D. M., Grissino-Mayer, H. D., Watson, E., and Luckman, B., 2000. “Tree-ring data document 16th century megadrought over North America.” EOS Transactions of the American Geophysical Union 81 (12), 121-125.

Stine, S., 1990. “Past Climate At Mono Lake.” Nature 345: 391.

Stine, S., 1994. “Extreme and persistent drought in California and Patagonia during mediaeval time.” Nature 369: 546-549.

Swetnam, T.W. 1993. “Fire history and climate change in Giant Sequoia groves.” Science 262, 885.

Panel Urges Research on Geoengineering as a Tool Against Climate Change (New York Times)

Piles at a CCI Energy Solutions coal handling plant in Shelbiana, Ky. Geoengineering proposals might counteract the effects of climate change that are the result of burning fossil fuels, such as coal. Credit: Luke Sharrett/Getty Images

With the planet facing potentially severe impacts from global warming in coming decades, a government-sponsored scientific panel on Tuesday called for more research on geoengineering — technologies to deliberately intervene in nature to counter climate change.

The panel said the research could include small-scale outdoor experiments, which many scientists say are necessary to better understand whether and how geoengineering would work.

Some environmental groups and others say that such projects could have unintended damaging effects, and could set society on an unstoppable path to full-scale deployment of the technologies.

But the National Academy of Sciences panel said that with proper governance, which it said needed to be developed, and other safeguards, such experiments should pose no significant risk.

In two widely anticipated reports, the panel — which was supported by NASA and other federal agencies, including what the reports described as the “U.S. intelligence community” — noted that drastically reducing emissions of carbon dioxide and other greenhouse gases was by far the best way to mitigate the effects of a warming planet.

A device being developed by a company called Global Thermostat is made to capture carbon dioxide from the air. This may be one solution to counteract climate change. Credit: Henry Fountain/The New York Times

But the panel, in making the case for more research into geoengineering, said, “It may be prudent to examine additional options for limiting the risks from climate change.”

“The committee felt that the need for information at this point outweighs the need for shoving this topic under the rug,” Marcia K. McNutt, chairwoman of the panel and the editor in chief of the journal Science, said at a news conference in Washington.

Geoengineering options generally fall into two categories: capturing and storing some of the carbon dioxide that has already been emitted so that the atmosphere traps less heat, or reflecting more sunlight away from the earth so there is less heat to start with. The panel issued separate reports on each.

The panel said that while the first option, called carbon dioxide removal, was relatively low risk, it was expensive, and that even if it was pursued on a planetwide scale, it would take many decades to have a significant impact on the climate. But the group said research was needed to develop efficient and effective methods to both remove the gas and store it so it remains out of the atmosphere indefinitely.

The second option, called solar radiation management, is far more controversial. Most discussions of the concept focus on the idea of dispersing sulfates or other chemicals high in the atmosphere, where they would reflect sunlight, in some ways mimicking the effect of a large volcanic eruption.

The process would be relatively inexpensive and should quickly lower temperatures, but it would have to be repeated indefinitely and would do nothing about another carbon dioxide-related problem: the acidification of oceans.

This approach might also have unintended effects on weather patterns around the world — bringing drought to once-fertile regions, for example. Or it might be used unilaterally as a weapon by governments or even extremely wealthy individuals.

Opponents of geoengineering have long argued that even conducting research on the subject presents a moral hazard that could distract society from the necessary task of reducing the emissions that are causing warming in the first place.

“A geoengineering ‘technofix’ would take us in the wrong direction,” Lisa Archer, food and technology program director of the environmental group Friends of the Earth, said in a statement. “Real climate justice requires dealing with root causes of climate change, not launching risky, unproven and unjust schemes.”

But the panel said that society had “reached a point where the severity of the potential risks from climate change appears to outweigh the potential risks from the moral hazard” of conducting research.

Ken Caldeira, a geoengineering researcher at the Carnegie Institution for Science and a member of the committee, said that while the panel felt that it was premature to deploy any sunlight-reflecting technologies today, “it’s worth knowing more about them,” including any problems that might make them unworkable.

“If there’s a real showstopper, we should know about it now,” Dr. Caldeira said, rather than discovering it later when society might be facing a climate emergency and desperate for a solution.

Dr. Caldeira is part of a small community of scientists who have researched solar radiation management concepts. Almost all of the research has been done on computers, simulating the effects of the technique on the climate. One attempt in Britain in 2011 to conduct an outdoor test of some of the engineering concepts provoked a public outcry. The experiment was eventually canceled.

David Keith, a researcher at Harvard University who reviewed the reports before they were released, said in an interview, “I think it’s terrific that they made a stronger call than I expected for research, including field research.” Along with other researchers, Dr. Keith has proposed a field experiment to test the effect of sulfate chemicals on atmospheric ozone.

Unlike some European countries, the United States has never had a separate geoengineering research program. Dr. Caldeira said establishing a separate program was unlikely, especially given the dysfunction in Congress. But he said that because many geoengineering research proposals might also help in general understanding of the climate, agencies that fund climate research might start to look favorably upon them.

Dr. Keith agreed, adding that he hoped the new reports would “break the logjam” and “give program managers the confidence they need to begin funding.”

At the news conference, Waleed Abdalati, a member of the panel and a professor at the University of Colorado, said that geoengineering research would have to be subject to governance that took into account not just the science, “but the human ramifications, as well.”

Dr. Abdalati said that, in general, the governance needed to precede the research. “A framework that addresses what kinds of activities would require governance is a necessary first step,” he said.

Raymond Pierrehumbert, a geophysicist at the University of Chicago and a member of the panel, said in an interview that while he thought that a research program that allowed outdoor experiments was potentially dangerous, “the report allows for enough flexibility in the process to follow that it could be decided that we shouldn’t have a program that goes beyond modeling.”

Above all, he said, “it’s really necessary to have some kind of discussion among broader stakeholders, including the public, to set guidelines for an allowable zone for experimentation.”