Tag archive: Risk

Michael E. Mann: “My Comments on New National Academy Report on Geoengineering”

By Michael E. Mann on Thursday, March 25, 2021 – 12:26


The U.S. National Academy of Sciences has published a new report (“Reflecting Sunlight“) on the topic of Geoengineering (that is, the deliberate manipulation of the global Earth environment in an effort to offset the effects of human carbon pollution-caused climate change). While I am, in full disclosure, a member of the Academy, I offer the following comments in an entirely independent capacity:

Let me start by congratulating the authors on their comprehensive assessment of the science. It is solid, as we would expect given the expertise of the author team and the reviewers, and the science underlying geoengineering is the true remit of the study. Chris Field, the lead author, is well qualified to lead the effort, and did a good job making sure that the intricacies of the science are covered, including the substantial uncertainties and caveats surrounding the potential environmental impacts of some of the riskier geoengineering strategies (e.g., stratospheric sulphate aerosol injection to block out sunlight).

I like the fact that there is a discussion of the importance of labels and terminology and how these can shape public perception. For example, the oft-used term “solar radiation management” is not favored by the report authors, as it can be misleading (we don’t have our hand on a dial that controls solar output). On the other hand, I think the term they do choose to use, “solar geoengineering”, is still potentially problematic, because it implies we are directly modifying solar output, and that’s not the case. We’re talking about messing with Earth’s atmospheric chemistry; we’re not dialing down the sun, even though many of the modeling experiments assume that’s what we’re doing. It’s a bit of a bait and switch. Even the title of the report, “Reflecting Sunlight”, falls victim to this biased framing.

In my recent book (“The New Climate War”), I quote one leading scientist on this:

“They don’t actually put aerosols in the atmosphere. They turn down the Sun to mimic geoengineering. You might think that is relatively unimportant . . . [but] controlling the Sun is effectively a perfect knob. We know almost precisely how a reduction in solar flux will project onto the energy balance of a planet. Aerosol-climate interactions are much more complex.”

I have a deeper and more substantive concern though, and it really is about the entire framing of the report. A report like this is as much about the policy message it conveys as it is about the scientific assessment, for it will be used immediately by policy advocates. And here I’m honestly troubled at the fodder it provides for mis-framing of the risks.

I recognize that the authors are dealing with a contentious and still much-debated topic, and it’s a challenge to represent the full range of views within the community, but the opening of the report itself, in my view, really puts a thumb on the scales. It falls victim to the moral hazard that I warn about in “The New Climate War” when it states, as justification for potentially considering implementing these geoengineering schemes:

But despite overwhelming evidence that the climate crisis is real and pressing, emissions of greenhouse gases continue to increase, with global emissions of fossil carbon dioxide rising 10.8 percent from 2010 through 2019. The total for 2020 is on track to decrease in response to decreased economic activity related to the COVID-19 pandemic. The pandemic is thus providing frustrating confirmation of the fact that the world has made little progress in separating economic activity from carbon dioxide emissions.

First of all, the discussion of carbon emissions reductions there is misleading. Emissions flattened in the years before the pandemic, and the International Energy Agency (IEA) specifically attributed that flattening to a decrease in global carbon emissions from the power generation sector. Those reductions have continued and contributed at least in part to the 7% decrease in global emissions last year. We will certainly need policy interventions favoring further decarbonization to maintain that level of decrease year after year, but if we can do that, we remain on a path to limiting warming below dangerous levels (a decent chance of staying below 1.5C and a very good chance of staying below 2C) without resorting to very risky geoengineering schemes. It is a matter of political willpower, not technology: we already have the technology necessary to decarbonize our economy.

The authors are basically arguing that because carbon reductions haven’t been great enough (thanks to successful opposition by polluters and their advocates) we should consider geoengineering. That framing (unintentionally, I realize) provides precisely the crutch that polluters are looking for.

As I explain in the book:

A fundamental problem with geoengineering is that it presents what is known as a moral hazard, namely, a scenario in which one party (e.g., the fossil fuel industry) promotes actions that are risky for another party (e.g., the rest of us), but seemingly advantageous to itself. Geoengineering provides a potential crutch for beneficiaries of our continued dependence on fossil fuels. Why threaten our economy with draconian regulations on carbon when we have a cheap alternative? The two main problems with that argument are that (1) climate change poses a far greater threat to our economy than decarbonization, and (2) geoengineering is hardly cheap—it comes with great potential harm.

So, in short, this report is somewhat of a mixed bag. The scientific assessment and discussion are solid, and there is a discussion of uncertainties and caveats in the detailed report. But the spin in the opening falls victim to moral hazard and will provide fodder for geoengineering advocates to use in leveraging policy decision-making.

I am somewhat troubled by that.

Opinion: Bill Gates and Warren Buffett should thank American taxpayers for their profitable farmland investments (Market Watch)

marketwatch.com

Last Updated: March 10, 2021 at 5:59 p.m. ET | First Published: March 10, 2021 at 8:28 a.m. ET

By Vincent H. Smith and Eric J. Belasco

Congress has reduced risk by underwriting crop prices and cash revenues

Bill Gates is now the largest owner of farmland in the U.S., having made substantial investments in at least 19 states across the country. He has apparently followed the advice of another wealthy investor, Warren Buffett, who in a February 24, 2014, letter to investors described farmland as an investment that has “no downside and potentially substantial upside.”

There is a simple explanation for this affection for agricultural assets. Since the early 1980s, Congress has consistently succumbed to pressures from farm interest groups to remove as much risk as possible from agricultural enterprises by using taxpayer funds to underwrite crop prices and cash revenues.

Over the years, three trends in farm subsidy programs have emerged.

The first and most visible is the expansion of the federally supported crop insurance program, which has grown from less than $200 million in 1981 to over $8 billion in 2021. In 1980, only a few crops were covered and the government’s goal was just to pay for administrative costs. Today taxpayers pay over two-thirds of the total cost of the insurance programs that protect farmers against drops in prices and yields for hundreds of commodities ranging from organic oranges to GMO soybeans.

The second trend is the continuation of longstanding programs to protect farmers against relatively low revenues because of price declines and lower-than-average crop yields. The subsidies, which on average cost taxpayers over $5 billion a year, are targeted to major Corn Belt crops such as soybeans and wheat. Also included are other commodities such as peanuts, cotton and rice, which are grown in congressionally powerful districts in Georgia, the Carolinas, Texas, Arkansas, Mississippi and California.

The third, more recent trend is a return over the past four years to a 1970s practice: annual ad hoc “one off” programs justified by political expediency with support from the White House and Congress. These expenditures were $5.1 billion in 2018, $14.7 billion in 2019, and over $32 billion in 2020, of which $29 billion came from COVID relief funds authorized in the CARES Act. An additional $13 billion for farm subsidies was later included in the December 2020 stimulus bill.

If you are wondering why so many different subsidy programs are used to compensate farmers multiple times for the same price drops and other revenue losses, you are not alone. Our research indicates that many owners of large farms collect taxpayer dollars from all three sources. For many of the farms ranked in the top 10% in terms of sales, recent annual payments exceeded a quarter of a million dollars.

Farms with average or modest sales received much less. Their subsidies ranged from close to zero for small farms to a few thousand dollars for average-sized operations.

So what does all this have to do with Bill Gates, Warren Buffett and their love of farmland as an investment? In a financial environment in which real interest rates have been near zero or negative for almost two decades, the annual average inflation-adjusted (real) rate of return in agriculture (over 80% of which consists of land) has been about 5% for the past 30 years, despite some ups and downs. It is a very solid investment for an owner who can hold on to farmland for the long term.
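
As a rough illustration of what that return implies, here is a back-of-the-envelope compounding calculation (not from the article; it simply plugs in the roughly 5% real return cited above):

```python
# Back-of-the-envelope compounding using the ~5% real annual return cited above.
real_rate = 0.05   # inflation-adjusted annual return on farmland assets
years = 30         # the holding period discussed in the article

growth_factor = (1 + real_rate) ** years
print(f"$1.00 held for {years} years grows to about ${growth_factor:.2f} in real terms")
# Roughly $4.32: purchasing power more than quadruples over three decades.
```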

The overwhelming majority of farm owners can manage that because they have substantial amounts of equity (the sector-wide debt-to-equity ratio has been less than 14% for many years) and receive significant revenue from other sources.

Thus for almost all farm owners, and especially the largest 10% whose net equity averages over $6 million, as Buffett observed, there is little or no risk and lots of potential gain in owning and investing in agricultural land.

Returns from agricultural land stem from two sources: asset appreciation (increases in land prices, which account for the majority of the gains) and net cash income from operating the land. As is well known, farmland prices are closely tied to expected future revenue, and those expected revenues include generous subsidies, which have averaged 17% of annual net cash income over the past 50 years. In addition, Congress often provides substantial additional one-off payments in years when net cash income is likely to be lower than average, as in 2000 and 2001 when grain prices were relatively low, and again in 2019 and 2020.

It is possible for small-scale investors to buy shares in real-estate investment trusts (REITs) that own and manage agricultural land. However, as with all such investments, how a REIT is managed can be a substantive source of risk unrelated to the underlying value of the land assets, not all of which may be farm land.

Thanks to Congress, and to average, less affluent American taxpayers, farmers and other agricultural landowners get a steady and substantial return on their investments through subsidies that consistently guarantee and increase those revenues.

While many agricultural support programs are meant to “save the family farm,” the largest beneficiaries of agricultural subsidies are the richest landowners with the largest farms who, like Bill Gates and Warren Buffett, are scarcely in any need of taxpayer handouts.

Vincent H. Smith is director of agricultural studies at the American Enterprise Institute, a Washington, D.C. think tank, and professor of economics at Montana State University. Eric J. Belasco is a visiting scholar at AEI.

Texas’s Power Crisis Has Turned Into a Disaster That Parallels Hurricane Katrina (TruthOut)

truthout.org

Sharon Zhang, Feb. 18, 2021


Propane tanks are placed in a line as people wait for the power to turn on to fill their tanks in Houston, Texas, on February 17, 2021. Mark Felix for The Washington Post via Getty Images

As many in Texas wake up still without power on Thursday morning, millions are now also having to contend with water shutdowns, boil advisories, and empty grocery shelves as cities struggle with keeping infrastructure powered and supply chains are interrupted.

As of Wednesday, an estimated 7 million Texans were under a boil advisory. Since then, Austin has also issued a citywide boil-water notice due to power loss at its biggest water treatment plant. Austin Water serves over a million customers, according to its website.

With hundreds of thousands of people still without power in the state, some reporting no water coming out of their faucets at all, and others facing burst pipes leading to collapsed ceilings and other damage to their homes, the situation is dire for many Texans facing multiple problems at once.

Even as some residents get their power restored, the problems continue to pile up, as the few grocery stores still open quickly sell out of food and supplies. While many without power watched their refrigerated food spoil, lines to get into stores wrapped around blocks and buildings, and store shelves sat completely empty with no indication of when new shipments would be coming in. Food banks have had to cancel deliveries and schools have halted meal distribution to students, the Texas Tribune reports.

People experiencing homelessness, including a disproportionate number of Black residents, have especially suffered in the record cold temperatures across the state. There have been some reports of people being found dead in the streets because of a lack of shelter.

“Businesses are shut down. Streets are empty, other than a few guys sliding around in 4x4s and fire trucks rushing to rescue people who turn their ovens on to keep warm and poison themselves with carbon monoxide,” wrote Austin resident Jeff Goodell in Rolling Stone. “Yesterday, the line at our neighborhood grocery store was three blocks long. People wandering around with handguns on their hip adds to a sense of lawlessness (Texas is an open-carry state).”

The Texas agricultural commissioner has said that farmers and ranchers are having to throw away millions of dollars worth of goods because of a lack of power. “We’re looking at a food supply chain problem like we’ve never seen before, even with COVID-19,” he told one local news affiliate.

An energy analyst likened the power crisis to the fallout of Hurricane Katrina as it’s becoming increasingly clear that the situation in Texas is a statewide disaster.

As natural gas output declined dramatically in the state, Paul Sankey, who leads energy analyst firm Sankey Research, said on Bloomberg, “This situation to me is very reminiscent of Hurricane Katrina…. We have never seen a loss [of energy supply] at this scale” in mid-winter. This is “the biggest outage in the history [of] U.S. oil and gas,” Sankey said.

Many others online echoed Sankey’s words as “Katrina” trended on Twitter, saying that the situation is similar to the hurricane disaster in that it has been downplayed by politicians but may be uncovered to be even more serious in the coming weeks.

Experts say that the power outages have partially been caused by the deregulation of the state’s electric grid. The government, some say, favored deregulatory actions like not requiring electrical equipment upgrades or proper weatherization, instead relying on free market mechanisms that ultimately contributed to the current disaster.

Former Gov. Rick Perry faced criticism on Wednesday when he said that Texans would rather face the current disaster than have to be regulated by the federal government. And he’s not the only Republican currently catching heat: many have begun calling for the resignation of Gov. Greg Abbott over a failure of leadership. On Wednesday, as millions suffered without power and under boil-water advisories, the governor went on Fox to attack clean energy and the Green New Deal, even though experts say clean energy was not a major contributor to the current crisis.

After declaring a state of emergency for Texas over the weekend, the Biden administration announced on Wednesday that it would be sending generators and other supplies to the state.

The freeze in Texas exposes America’s infrastructural failings (The Economist)

economist.com

Feb 17th 2021

You ain’t foolin’ nobody with the lights out

WHEN IT RAINS, it pours, and when it snows, the lights turn off. Or so it goes in Texas. After a winter storm pummelled the Lone Star State with record snowfall and the lowest temperatures in more than 30 years, millions were left without electricity and heat. On February 16th 4.5m Texan households were cut off from power, as providers were overloaded with demand and tried to shuffle access to electricity so the whole grid did not go down.

Whole skylines, including Dallas’s, went dark to conserve power. Some Texans braved the snowy roads to check into the few hotels with remaining rooms, only for the hotels’ power to go off as they arrived. Others donned skiwear and remained inside, hoping the lights and heat would come back on. Across the state, what were supposed to be “rolling” blackouts lasted for days. It is still too soon to quantify the devastation. More than 20 people have died in motor accidents, from fires lit for warmth and from carbon-monoxide poisoning from using cars for heat. The storm has also halted deliveries of covid-19 vaccines and may prevent around 1m vaccinations from happening this week. Several retail electricity providers are likely to go bankrupt, after being hit with surging wholesale power prices.

Other states, including Tennessee, were also covered in snow, but Texas got the lion’s share and ground to a halt. Texans are rightly furious that residents of America’s energy capital cannot count on reliable power. Everyone is asking why.

The short answer is that the Electric Reliability Council of Texas (ERCOT), which operates the grid, did not properly forecast the demand for energy as a result of the storm. Some say that this was nearly impossible to predict, but there were warnings of the severity of the coming weather in the preceding week, and ERCOT’s projections fell notably short. Brownouts last summer had already demonstrated the grid’s lack of excess capacity, says George O’Leary of Tudor, Pickering, Holt & Co. (TPH), an energy investment bank.

Many Republican politicians were quick to blame renewable energy sources, such as wind power, for the blackouts, but that is not fair. Some wind turbines did indeed freeze, but natural gas, which accounts for around half of the state’s electricity generation, was the primary source of the shortfall. Plants broke down, as did the gas supply chain and pipelines. The cold also caused a reactor at one of the state’s two nuclear plants to go offline. Transmission lines may have also iced up, says Wade Schauer of Wood Mackenzie, an energy-research firm. In short, Texas experienced a perfect storm.

Some of the blame falls on the unique design of the electricity market in Texas. Of America’s 48 contiguous states, it is the only one with its own stand-alone electricity grid—the Texas Interconnection. This means that when power generators fail, the state cannot import electricity from outside its borders.

The state’s deregulated power market is also fiercely competitive. ERCOT oversees the grid, while power generators produce electricity for the wholesale market. Some 300 retail electricity providers buy that power and then compete for consumers. Because such cold weather is rare, energy companies do not invest in “winterising” their equipment, as this would raise their prices for consumers. Perhaps most important, the state does not have a “capacity market”, which would ensure that there was extra power available for surging demand. This acts as a sort of insurance policy so the lights will not go out, but it also means customers pay higher bills.

For years the benefits of Texas’s deregulated market structure were clear. At 8.6 cents per kilowatt hour, the state’s average retail price for electricity is around one-fifth lower than the national average and about half the cost of California’s. In 1999 the state set targets for renewables, and today it accounts for around 30% of America’s wind energy.

This disaster is prompting people to question whether Texas’s system is as resilient and well-designed as people previously believed. Greg Abbott, the governor, has called for an investigation into ERCOT. This storm “has exposed some serious weaknesses in our free-market approach in Texas”, says Luke Metzger of Environment Texas, a non-profit, who had been without power for 54 hours when The Economist went to press.

Wholly redesigning the power grid in Texas seems unlikely. After the snow melts, the state will need to tackle two more straightforward questions. The first is whether it needs to increase reserve capacity. “If we impose a capacity market here and a bunch of new cap-ex is required to winterise equipment, who bears that cost? Ultimately it’s the customer,” says Bobby Tudor, chairman of TPH. The second is how Texas can ensure the reliability of equipment in extreme weather conditions. After a polar vortex in 2014 hit the east coast, PJM, a regional transmission organisation, started making higher payments based on reliability of service, says Michael Weinstein of Credit Suisse, a bank. In Texas there is no penalty for systems going down, except for public complaints and politicians’ finger-pointing.

Texas is hardly the only state to struggle with blackouts. California, which has a more tightly regulated power market, is regularly plunged into darkness during periods of high heat, winds and wildfires. Unlike Texas, much of northern California is dependent on a single utility, PG&E. The company has been repeatedly sued for dismal, dangerous management. But, as in Texas, critics have blamed intermittent renewable power for blackouts. In truth, California’s blackouts share many of the same causes as those in Texas: extreme weather, power generators that failed unexpectedly, poor planning by state regulators and an inability (in California, temporary) to import power from elsewhere. In California’s blackouts last year, solar output naturally declined in the evening. But gas plants also went offline and weak rainfall lowered the output of hydroelectric dams.

In California, as in Texas, it would help to have additional power generation, energy storage to meet peak demand and more resilient infrastructure, such as buried power lines and more long-distance, high-voltage transmission. Weather events that once might have been dismissed as unusual are becoming more common. Without more investment in electricity grids, blackouts will be, too.

A Glimpse of America’s Future: Climate Change Means Trouble for Power Grids (New York Times)

nytimes.com

Brad Plumer, Feb. 17, 2021


Systems are designed to handle spikes in demand, but the wild and unpredictable weather linked to global warming will very likely push grids beyond their limits.
A street in Austin, Texas, without power on Monday evening.
Credit: Tamir Kalifa for The New York Times

Published Feb. 16, 2021; updated Feb. 17, 2021, 6:59 a.m. ET

Huge winter storms plunged large parts of the central and southern United States into an energy crisis this week, with frigid blasts of Arctic weather crippling electric grids and leaving millions of Americans without power amid dangerously cold temperatures.

The grid failures were most severe in Texas, where more than four million people woke up Tuesday morning to rolling blackouts. Separate regional grids in the Southwest and Midwest also faced serious strain. As of Tuesday afternoon, at least 23 people nationwide had died in the storm or its aftermath.

Analysts have begun to identify key factors behind the grid failures in Texas. Record-breaking cold weather spurred residents to crank up their electric heaters and pushed power demand beyond the worst-case scenarios that grid operators had planned for. At the same time, a large fraction of the state’s gas-fired power plants were knocked offline amid icy conditions, with some plants suffering fuel shortages as natural gas demand spiked. Many of Texas’ wind turbines also froze and stopped working.

The crisis sounded an alarm for power systems throughout the country. Electric grids can be engineered to handle a wide range of severe conditions — as long as grid operators can reliably predict the dangers ahead. But as climate change accelerates, many electric grids will face extreme weather events that go far beyond the historical conditions those systems were designed for, putting them at risk of catastrophic failure.

While scientists are still analyzing what role human-caused climate change may have played in this week’s winter storms, it is clear that global warming poses a barrage of additional threats to power systems nationwide, including fiercer heat waves and water shortages.

Measures that could help make electric grids more robust — such as fortifying power plants against extreme weather, or installing more backup power sources — could prove expensive. But as Texas shows, blackouts can be extremely costly, too. And, experts said, unless grid planners start planning for increasingly wild and unpredictable climate conditions, grid failures will happen again and again.

“It’s essentially a question of how much insurance you want to buy,” said Jesse Jenkins, an energy systems engineer at Princeton University. “What makes this problem even harder is that we’re now in a world where, especially with climate change, the past is no longer a good guide to the future. We have to get much better at preparing for the unexpected.”

Texas’ main electric grid, which largely operates independently from the rest of the country, has been built with the state’s most common weather extremes in mind: soaring summer temperatures that cause millions of Texans to turn up their air-conditioners all at once.

While freezing weather is rarer, grid operators in Texas have also long known that electricity demand can spike in the winter, particularly after damaging cold snaps in 2011 and 2018. But this week’s winter storms, which buried the state in snow and ice, and led to record-cold temperatures, surpassed all expectations — and pushed the grid to its breaking point.

Residents of East Dallas trying to warm up on Monday after their family home lost power.
Credit: Juan Figueroa/The Dallas Morning News, via Associated Press

Texas’ grid operators had anticipated that, in the worst case, the state would use 67 gigawatts of electricity during the winter peak. But by Sunday evening, power demand had surged past that level. As temperatures dropped, many homes were relying on older, inefficient electric heaters that consume more power.

The problems compounded from there, with frigid weather on Monday disabling power plants with capacity totaling more than 30 gigawatts. The vast majority of those failures occurred at thermal power plants, like natural gas generators, as plummeting temperatures paralyzed plant equipment and soaring demand for natural gas left some plants struggling to obtain sufficient fuel. A number of the state’s power plants were also offline for scheduled maintenance in preparation for the summer peak.

The state’s fleet of wind farms also lost up to 4.5 gigawatts of capacity at times, as many turbines stopped working in cold and icy conditions, though this was a smaller part of the problem.
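
Taken together, the figures above give a sense of the scale of the shortfall. The sketch below is illustrative arithmetic using only the numbers quoted in this article, not an official ERCOT accounting:

```python
# Illustrative arithmetic from the figures quoted above (not an ERCOT accounting).
planned_worst_case_peak_gw = 67.0  # ERCOT's worst-case winter demand plan
thermal_capacity_lost_gw = 30.0    # gas and other thermal plants knocked offline
wind_capacity_lost_gw = 4.5        # wind turbines lost to icing, at the worst moments

total_capacity_lost_gw = thermal_capacity_lost_gw + wind_capacity_lost_gw
share_of_plan = total_capacity_lost_gw / planned_worst_case_peak_gw

print(f"capacity offline: ~{total_capacity_lost_gw:.1f} GW")
print(f"that is about {share_of_plan:.0%} of the planned worst-case peak, "
      f"while actual demand had already surged past {planned_worst_case_peak_gw:.0f} GW")
```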

In essence, experts said, an electric grid optimized to deliver huge quantities of power on the hottest days of the year was caught unprepared when temperatures plummeted.

While analysts are still working to untangle all of the reasons behind Texas’ grid failures, some have also wondered whether the unique way the state manages its largely deregulated electricity system may have played a role. In the mid-1990s, for instance, Texas decided against paying energy producers to hold a fixed number of backup power plants in reserve, instead letting market forces dictate what happens on the grid.

On Tuesday, Gov. Greg Abbott called for an emergency reform of the Electric Reliability Council of Texas, the nonprofit corporation that oversees the flow of power in the state, saying its performance had been “anything but reliable” over the previous 48 hours.

In theory, experts said, there are technical solutions that can avert such problems.

Wind turbines can be equipped with heaters and other devices so that they can operate in icy conditions — as is often done in the upper Midwest, where cold weather is more common. Gas plants can be built to store oil on-site and switch over to burning the fuel if needed, as is often done in the Northeast, where natural gas shortages are common. Grid regulators can design markets that pay extra to keep a larger fleet of backup power plants in reserve in case of emergencies, as is done in the Mid-Atlantic.

But these solutions all cost money, and grid operators are often wary of forcing consumers to pay extra for safeguards.

“Building in resilience often comes at a cost, and there’s a risk of both underpaying but also of overpaying,” said Daniel Cohan, an associate professor of civil and environmental engineering at Rice University. “It’s a difficult balancing act.”

In the months ahead, as Texas grid operators and policymakers investigate this week’s blackouts, they will likely explore how the grid might be bolstered to handle extremely cold weather. Some possible ideas include building more connections between Texas and other states to balance electricity supplies, a move the state has long resisted; encouraging homeowners to install battery backup systems; or keeping additional power plants in reserve.

The search for answers will be complicated by climate change. Over all, the state is getting warmer as global temperatures rise, and cold-weather extremes are, on average, becoming less common over time.

But some climate scientists have also suggested that global warming could, paradoxically, bring more unusually fierce winter storms. Some research indicates that Arctic warming is weakening the jet stream, the high-level air current that circles the northern latitudes and usually holds back the frigid polar vortex. This can allow cold air to periodically escape to the South, resulting in episodes of bitter cold in places that rarely get nipped by frost.

Credit: Jacob Ford/Odessa American, via Associated Press

But this remains an active area of debate among climate scientists, with some experts less certain that polar vortex disruptions are becoming more frequent, making it even trickier for electricity planners to anticipate the dangers ahead.

All over the country, utilities and grid operators are confronting similar questions, as climate change threatens to intensify heat waves, floods, water shortages and other calamities, all of which could create novel risks for the nation’s electricity systems. Adapting to those risks could carry a hefty price tag: One recent study found that the Southeast alone may need 35 percent more electric capacity by 2050 simply to deal with the known hazards of climate change.

And the task of building resilience is becoming increasingly urgent. Many policymakers are promoting electric cars and electric heating as a way of curbing greenhouse gas emissions. But as more of the nation’s economy depends on reliable flows of electricity, the cost of blackouts will become ever more dire.

“This is going to be a significant challenge,” said Emily Grubert, an infrastructure expert at Georgia Tech. “We need to decarbonize our power systems so that climate change doesn’t keep getting worse, but we also need to adapt to changing conditions at the same time. And the latter alone is going to be very costly. We can already see that the systems we have today aren’t handling this very well.”

John Schwartz, Dave Montgomery and Ivan Penn contributed reporting.

Calculations show that it will be impossible to control a superintelligent Artificial Intelligence (Engenharia é:)

engenhariae.com.br

Ademilson Ramos, January 23, 2021


Photo by Alex Knight on Unsplash

The idea of artificial intelligence overthrowing humanity has been discussed for many decades, and scientists have just delivered their verdict on whether we would be able to control a high-level computer superintelligence. The answer? Almost definitely not.

The problem is that controlling a superintelligence far beyond human comprehension would require a simulation of that superintelligence which we can analyze. But if we are unable to comprehend it, it is impossible to create such a simulation.

Rules such as “cause no harm to humans” cannot be defined if we do not understand the kinds of scenarios an AI will come up with, the researchers suggest. Once a computer system is working at a level beyond the scope of our programmers, we can no longer set limits.

“A superintelligence poses a fundamentally different problem than those typically studied under the banner of ‘robot ethics’,” the researchers write.

“This is because a superintelligence is multi-faceted and therefore potentially capable of mobilizing a diversity of resources in order to achieve objectives that are potentially incomprehensible to humans, let alone controllable.”

Part of the team’s reasoning comes from the halting problem put forward by Alan Turing in 1936. The problem centers on whether or not a computer program will reach a conclusion and answer (and therefore halt), or simply loop forever trying to find one.

As Turing proved through some clever math, while we can know the answer for some specific programs, it is logically impossible to find a way of knowing it for every potential program that could ever be written. That brings us back to AI, which, in a superintelligent state, could hold every possible computer program in its memory at once.

Any program written to stop an AI from harming humans and destroying the world, for example, may reach a conclusion (and halt) or not; it is mathematically impossible for us to be absolutely sure either way, which means the AI cannot be contained.
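
To make the reasoning above concrete, here is a minimal Python sketch of the diagonalization argument behind the halting problem, recast in the containment setting the researchers describe. The function names (would_halt, paradox) are hypothetical illustrations, not code from the study:

```python
# Minimal sketch of the diagonalization argument behind the halting problem,
# recast as a hypothetical "containment checker". All names are illustrative.

def would_halt(program_source: str, program_input: str) -> bool:
    """Hypothetical oracle: returns True if running program_source on
    program_input would eventually halt (think of a containment routine
    deciding whether a harm-prevention check ever reaches a verdict).
    No such general-purpose function can actually be implemented."""
    raise NotImplementedError("provably impossible in general (Turing, 1936)")

PARADOX_SOURCE = '''
def paradox(source):
    # Ask the oracle about this very program run on its own source code.
    if would_halt(source, source):
        while True:      # if the oracle says "halts", loop forever...
            pass
    return               # ...and if it says "loops", halt immediately.
'''

# Whatever answer would_halt(PARADOX_SOURCE, PARADOX_SOURCE) gave, the paradox
# program would do the opposite, so no correct would_halt can exist. By the same
# reasoning, no general algorithm can decide whether an arbitrary "keep the AI
# from causing harm" routine ever reaches a verdict.
```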

“In effect, this renders the containment algorithm unusable,” says computer scientist Iyad Rahwan of the Max Planck Institute for Human Development in Germany.

The alternative to teaching AI some ethics and telling it not to destroy the world (something which, the researchers say, no algorithm can be absolutely certain of doing) is to limit the superintelligence’s capabilities. It could be cut off from parts of the internet or from certain networks, for example.

The new study rejects this idea too, suggesting that it would limit the reach of the artificial intelligence; the argument is that if we are not going to use it to solve problems beyond the scope of humans, then why create it at all?

If we are going to push ahead with artificial intelligence, we might not even know when a superintelligence beyond our control arrives, such is its incomprehensibility. That means we need to start asking some serious questions about the directions we are taking.

“A superintelligent machine that controls the world sounds like science fiction,” says computer scientist Manuel Cebrian of the Max Planck Institute for Human Development. “But there are already machines that perform certain important tasks independently without programmers fully understanding how they learned them.”

“The question therefore arises whether this could at some point become uncontrollable and dangerous for humanity.”

The research has been published in the Journal of Artificial Intelligence Research.

Developing Algorithms That Might One Day Be Used Against You (Gizmodo)

gizmodo.com

Ryan F. Mandelbaum, Jan 24, 2021


Brian Nord is an astrophysicist and machine learning researcher. Photo: Mark Lopez/Argonne National Laboratory

Machine learning algorithms serve us the news we read, the ads we see, and in some cases even drive our cars. But there’s an insidious layer to these algorithms: They rely on data collected by and about humans, and they spit our worst biases right back out at us. For example, job candidate screening algorithms may automatically reject names that sound like they belong to nonwhite people, while facial recognition software is often much worse at recognizing women or nonwhite faces than it is at recognizing white male faces. An increasing number of scientists and institutions are waking up to these issues, and speaking out about the potential for AI to cause harm.

Brian Nord is one such researcher weighing his own work against the potential to cause harm with AI algorithms. Nord is a cosmologist at Fermilab and the University of Chicago, where he uses artificial intelligence to study the cosmos, and he’s been researching a concept for a “self-driving telescope” that can write and test hypotheses with the help of a machine learning algorithm. At the same time, he’s struggling with the idea that the algorithms he’s writing may one day be biased against him—and even used against him—and is working to build a coalition of physicists and computer scientists to fight for more oversight in AI algorithm development.

This interview has been edited and condensed for clarity.

Gizmodo: How did you become a physicist interested in AI and its pitfalls?

Brian Nord: My Ph.D. is in cosmology, and when I moved to Fermilab in 2012, I moved into the subfield of strong gravitational lensing. [Editor’s note: Gravitational lenses are places in the night sky where light from distant objects has been bent by the gravitational field of heavy objects in the foreground, making the background objects appear warped and larger.] I spent a few years doing strong lensing science in the traditional way, where we would visually search through terabytes of images, through thousands of candidates of these strong gravitational lenses, because they’re so weird, and no one had figured out a more conventional algorithm to identify them. Around 2015, I got kind of sad at the prospect of only finding these things with my eyes, so I started looking around and found deep learning.

Here we are a few years later—myself and a few other people popularized this idea of using deep learning—and now it’s the standard way to find these objects. People are unlikely to go back to using methods that aren’t deep learning to do galaxy recognition. We got to this point where we saw that deep learning is the thing, and really quickly saw the potential impact of it across astronomy and the sciences. It’s hitting every science now. That is a testament to the promise and peril of this technology, with such a relatively simple tool. Once you have the pieces put together right, you can do a lot of different things easily, without necessarily thinking through the implications.

Gizmodo: So what is deep learning? Why is it good and why is it bad?

BN: Traditional mathematical models (like the F=ma of Newton’s laws) are built by humans to describe patterns in data: We use our current understanding of nature, also known as intuition, to choose the pieces, the shape of these models. This means that they are often limited by what we know or can imagine about a dataset. These models are also typically smaller and are less generally applicable for many problems.

On the other hand, artificial intelligence models can be very large, with many, many degrees of freedom, so they can be made very general and able to describe lots of different data sets. Also, very importantly, they are primarily sculpted by the data that they are exposed to—AI models are shaped by the data with which they are trained. Humans decide what goes into the training set, which is then limited again by what we know or can imagine about that data. It’s not a big jump to see that if you don’t have the right training data, you can fall off the cliff really quickly.

The promise and peril are highly related. In the case of AI, the promise is in the ability to describe data that humans don’t yet know how to describe with our ‘intuitive’ models. But, perilously, the data sets used to train them incorporate our own biases. When it comes to AI recognizing galaxies, we’re risking biased measurements of the universe. When it comes to AI recognizing human faces, when our data sets are biased against Black and Brown faces for example, we risk discrimination that prevents people from using services, that intensifies surveillance apparatus, that jeopardizes human freedoms. It’s critical that we weigh and address these consequences before we imperil people’s lives with our research.
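
(Illustration, not part of the interview: the mechanism Nord describes is easy to reproduce. The sketch below uses purely synthetic data and hypothetical feature names to show how a model fit to historically biased decisions learns and repeats that bias.)

```python
# A minimal, self-contained sketch showing how a model trained on historically
# biased decisions reproduces that bias. Data is synthetic; names are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# "qualification" is the signal we actually care about; "group" is a protected
# attribute that should be irrelevant to the decision.
qualification = rng.normal(size=n)
group = rng.integers(0, 2, size=n)          # 0 or 1

# Historical labels: past decision-makers penalized group 1 regardless of merit.
logits = 1.5 * qualification - 1.0 * group
past_decisions = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(int)

# Train a model on those historical decisions, with the protected attribute
# available as a feature (directly, or via proxies in real data).
X = np.column_stack([qualification, group])
model = LogisticRegression().fit(X, past_decisions)

# Probe: identical qualification, different group membership.
probe = np.column_stack([np.zeros(2), np.array([0, 1])])
p = model.predict_proba(probe)[:, 1]
print(f"acceptance probability, group 0: {p[0]:.2f}")
print(f"acceptance probability, group 1: {p[1]:.2f}")
# The gap between the two numbers is the historical bias, faithfully learned.
```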

Gizmodo: When did the light bulb go off in your head that AI could be harmful?

BN: I gotta say that it was with the Machine Bias article from ProPublica in 2016, where they discuss recidivism and sentencing procedure in courts. At the time of that article, there was a closed-source algorithm used to make recommendations for sentencing, and judges were allowed to use it. There was no public oversight of this algorithm, which ProPublica found was biased against Black people; people could use algorithms like this willy nilly without accountability. I realized that as a Black man, I had spent the last few years getting excited about neural networks, then saw it quite clearly that these applications that could harm me were already out there, already being used, and were already starting to become embedded in our social structure through the criminal justice system. Then I started paying attention more and more. I realized countries across the world were using surveillance technology, incorporating machine learning algorithms, for widespread oppressive uses.

Gizmodo: How did you react? What did you do?

BN: I didn’t want to reinvent the wheel; I wanted to build a coalition. I started looking into groups like Fairness, Accountability and Transparency in Machine Learning, plus Black in AI, which is focused on building communities of Black researchers in the AI field, but which also has the unique awareness of the problem because we are the people who are affected. I started paying attention to the news and saw that Meredith Whittaker had started a think tank to combat these things, and Joy Buolamwini had helped found the Algorithmic Justice League. I brushed up on what computer scientists were doing and started to look at what physicists were doing, because that’s my principal community.

It became clear to folks like me and Savannah Thais that physicists needed to realize that they have a stake in this game. We get government funding, and we tend to take a fundamental approach to research. If we bring that approach to AI, then we have the potential to affect the foundations of how these algorithms work and impact a broader set of applications. I asked myself and my colleagues what our responsibility in developing these algorithms was and in having some say in how they’re being used down the line.

Gizmodo: How is it going so far?

BN: Currently, we’re going to write a white paper for SNOWMASS, this high-energy physics event. The SNOWMASS process determines the vision that guides the community for about a decade. I started to identify individuals to work with, fellow physicists, and experts who care about the issues, and develop a set of arguments for why physicists from institutions, individuals, and funding agencies should care deeply about these algorithms they’re building and implementing so quickly. It’s a piece that’s asking people to think about how much they are considering the ethical implications of what they’re doing.

We’ve already held a workshop at the University of Chicago where we’ve begun discussing these issues, and at Fermilab we’ve had some initial discussions. But we don’t yet have the critical mass across the field to develop policy. We can’t do it ourselves as physicists; we don’t have backgrounds in social science or technology studies. The right way to do this is to bring physicists together from Fermilab and other institutions with social scientists and ethicists and science and technology studies folks and professionals, and build something from there. The key is going to be through partnership with these other disciplines.

Gizmodo: Why haven’t we reached that critical mass yet?

BN: I think we need to show people, as Angela Davis has said, that our struggle is also their struggle. That’s why I’m talking about coalition building. The thing that affects us also affects them. One way to do this is to clearly lay out the potential harm beyond just race and ethnicity. Recently, there was this discussion of a paper that used neural networks to try and speed up the selection of candidates for Ph.D programs. They trained the algorithm on historical data. So let me be clear, they said here’s a neural network, here’s data on applicants who were denied and accepted to universities. Those applicants were chosen by faculty and people with biases. It should be obvious to anyone developing that algorithm that you’re going to bake in the biases in that context. I hope people will see these things as problems and help build our coalition.

Gizmodo: What is your vision for a future of ethical AI?

BN: What if there were an agency or agencies for algorithmic accountability? I could see these existing at the local level, the national level, and the institutional level. We can’t predict all of the future uses of technology, but we need to be asking questions at the beginning of the processes, not as an afterthought. An agency would help ask these questions and still allow the science to get done, but without endangering people’s lives. Alongside agencies, we need policies at various levels that make a clear decision about how safe the algorithms have to be before they are used on humans or other living things. If I had my druthers, these agencies and policies would be built by an incredibly diverse group of people. We’ve seen instances where a homogeneous group develops an app or technology and didn’t see the things that another group who’s not there would have seen. We need people across the spectrum of experience to participate in designing policies for ethical AI.

Gizmodo: What are your biggest fears about all of this?

BN: My biggest fear is that people who already have access to technology resources will continue to use them to subjugate people who are already oppressed; Pratyusha Kalluri has also advanced this idea of power dynamics. That’s what we’re seeing across the globe. Sure, there are cities that are trying to ban facial recognition, but unless we have a broader coalition, unless we have more cities and institutions willing to take on this thing directly, we’re not going to be able to keep this tool from exacerbating the white supremacy, racism, and misogyny that already exist inside structures today. If we don’t push policy that puts the lives of marginalized people first, then they’re going to continue being oppressed, and it’s going to accelerate.

Gizmodo: How has thinking about AI ethics affected your own research?

BN: I have to question whether I want to do AI work and how I’m going to do it; whether or not it’s the right thing to do to build a certain algorithm. That’s something I have to keep asking myself… Before, it was like, how fast can I discover new things and build technology that can help the world learn something? Now there’s a significant piece of nuance to that. Even the best things for humanity could be used in some of the worst ways. It’s a fundamental rethinking of the order of operations when it comes to my research.

I don’t think it’s weird to think about safety first. We have OSHA and safety groups at institutions who write down lists of things you have to check off before you’re allowed to take out a ladder, for example. Why are we not doing the same thing in AI? A part of the answer is obvious: Not all of us are people who experience the negative effects of these algorithms. But as one of the few Black people at the institutions I work in, I’m aware of it, I’m worried about it, and the scientific community needs to appreciate that my safety matters too, and that my safety concerns don’t end when I walk out of work.

Gizmodo: Anything else?

BN: I’d like to re-emphasize that when you look at some of the research that has come out, like vetting candidates for graduate school, or when you look at the biases of the algorithms used in criminal justice, these are problems being repeated over and over again, with the same biases. It doesn’t take a lot of investigation to see that bias enters these algorithms very quickly. The people developing them should really know better. Maybe there needs to be more educational requirements for algorithm developers to think about these issues before they have the opportunity to unleash them on the world.

This conversation needs to be raised to the level where individuals and institutions consider these issues a priority. Once you’re there, you need people to see that this is an opportunity for leadership. If we can get a grassroots community to help an institution to take the lead on this, it incentivizes a lot of people to start to take action.

And finally, people who have expertise in these areas need to be allowed to speak their minds. We can’t allow our institutions to quiet us so we can’t talk about the issues we’re bringing up. The fact that I have experience as a Black man doing science in America, and the fact that I do AI—that should be appreciated by institutions. It gives them an opportunity to have a unique perspective and take a unique leadership position. I would be worried if individuals felt like they couldn’t speak their mind. If we can’t get these issues out into the sunlight, how will we be able to build out of the darkness?

Ryan F. Mandelbaum – Former Gizmodo physics writer and founder of Birdmodo, now a science communicator specializing in quantum computing and birds

Opportunity and risk in the nature-based bioeconomy (SciDevNet)

16/11/20

Farmers tending their plants. But Chatham House’s Patrick Schröder warns that green isn’t always sustainable. Copyright: USAID/Natasha Murigu/(CC BY-NC 2.0)

Speed read

  • ‘Green’ doesn’t always mean ‘sustainable’, says circular economy specialist
  • Bioeconomy an essential part of the global economy
  • But it has the potential to further degrade the environment

By: Patrick Schröder

Green isn’t always sustainable, Chatham House’s Patrick Schröder warns as the Global Bioeconomy Summit kicks off.

All that glitters is not gold, or so the expression goes. Similarly, as business leaders, academics, and policymakers gather for the third Global Bioeconomy Summit it’s worth noting that all that’s green is not necessarily sustainable.

The ‘bioeconomy’ is a sophisticated sounding term, but essentially it means the things we make, use and sell that have their origins in nature; and the aim is to transition the economy from fossil resources towards renewable ones. Farming and forestry are part of the bioeconomy, as is energy produced from biomass, and services like tourism that are rooted in nature and outdoor experiences. The bioeconomy is central to what we do every day, and is an essential part of the global economy. In Europe alone the bioeconomy has an annual value of €2.4 trillion. It holds the key to a greener, more sustainable and healthy future for all — if the right practices, regulations and incentives are in place.

“Governments have the choice to use the bioeconomy as a source of regenerative and sustainable development that upholds the rights of citizens and protects crucial ecological systems.” – Patrick Schröder

At the same time, the bioeconomy has the potential to drive further environmental destruction and degradation. Irresponsible pursuit of profit and unsustainable exploitation of natural resources are making climate change, biodiversity loss, infectious diseases, hunger and inequality much worse. A recent report from the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES) found that unless we dramatically reduce our impact on the natural world, future pandemics will become more frequent, spread more quickly and kill more people. 

High levels of consumption in industrialised countries have far-reaching impacts on ecosystems, food security and human rights both within and beyond their borders. Low- and middle-income countries are directly affected by the policies and practices of the global North, and ordinary citizens have limited influence. Demand in the United States and the United Kingdom for beef directly drives deforestation in the Amazon; while the number of everyday products that contain unsustainable palm oil continues to increase.

Sustainability challenge

An unsustainable bioeconomy also threatens the achievement of the Sustainable Development Goals (SDGs) — a global sustainability framework adopted by the United Nations in 2015. A recent report by the German Federal Environment Agency found that in order for the bioeconomy to work for, rather than against, the SDGs, the global agenda and national strategies need to focus much more on restoration of ecosystems, sustainable land-use, climate protection and food sovereignty.

Forests are a key testing ground. These ecosystems have a huge and positive impact on biodiversity, conservation and climate, and provide livelihoods and a place to live for millions of people. But they face an existential threat from unsustainable economic activity. For example, large-scale bioenergy production in Latin America and the Caribbean, where forests cover almost half of all land area, is competing for space with farming of monoculture crops for export, with serious consequences for biodiversity and food security of smallholders.

We’ll hear lots about the potential of the bioeconomy for delivering sustainable economic growth during this week’s summit, but we should be sceptical about the sustainability credentials of a system with a track record of pushing marginalised communities and vulnerable ecosystems to the limit. Without better governance and institutional frameworks, the bioeconomy will only exacerbate social and environmental problems.

Three steps

At Chatham House, we’ve been looking into how to make the bioeconomy more sustainable. Our research suggests that there are three things that will drive better outcomes.

First, we need to bring in new voices. Currently, bioeconomy policy processes are dominated by industry, science and a small circle of political actors. There’s an unequal agenda here: those most affected by the policies are rarely the ones shaping them. Greater efforts to include civil society and a wider range of government departments in decision making will encourage the bioeconomy to work for a larger group of people.

Second, governments need to use the right governance mechanisms. So far, only a small number of countries with national bioeconomy strategies consider the potential negative impacts and environmental risks. The development of the bioeconomy needs to align with existing international governance and support mechanisms for sustainable land-use, soil protection and forest conservation. The UN Biodiversity Conference, due to take place in China next year, will be important for establishing cohesive governance mechanisms and regulations to prevent trade-offs between the bioeconomy and biodiversity protection.

Third, the bioeconomy should embrace circular principles. Much of the bioeconomy is based on our current linear model of ‘take–make–throw away’, where resources are extracted, turned into products, consumed and discarded. This is fundamentally unsustainable. In a circular bioeconomy the cascading use principle is applied to biomass resources, such as wood and agricultural products. This approach gives priority to processes that allow the reuse and recycling of products and raw materials. It increases the productivity and efficient use of scarce and valuable raw material resources.

Governments have the choice to use the bioeconomy as a source of regenerative and sustainable development that upholds the rights of citizens and protects crucial ecological systems. Or, they can allow the evolution of a system that is just as exploitative, unsustainable and profit-driven as other parts of the economy.

Delegates at this week’s summit should think hard about the actions they can take to ensure the growing bioeconomy fulfils its promise to serve the needs of people and planet, and help deliver on the Sustainable Development Goals.

Patrick Schröder is a senior research fellow in Chatham House’s energy, environment and resources programme. He specialises in the circular economy and resource governance in developing countries.

Pope Francis asks for prayers for robots and AI (Tecmundo)

11/11/2020 at 18:30, 1 min read


Jorge Marin

Pope Francis has asked the faithful around the world to pray, throughout the month of November, that progress in robotics and artificial intelligence (AI) may always serve humanity.

The message is part of a series of prayer intentions that the pontiff announces annually and shares each month on YouTube to help Catholics “deepen their daily prayer” by focusing on specific topics. In September, the pope asked for prayers for the “sharing of the planet’s resources”; in August, for the “maritime world”; and now it is the turn of robots and AI.

In his message, Pope Francis called for special attention to AI, which, in his words, is “at the heart of the historic change we are experiencing”, and noted that the issue goes beyond the benefits that robotics can bring to the world.

Technological progress and algorithms

Francis says that technological progress is not always a sign of well-being for humanity, because if that progress contributes to increasing inequality, it cannot be considered true progress. “Future advances must be oriented towards respect for the dignity of the person,” the pope warns.

Concern that technology could deepen existing social divisions led the Vatican to sign, earlier this year, together with Microsoft and IBM, the “Rome Call for AI Ethics”, a document that sets out principles to guide the deployment of AI: transparency, inclusion, impartiality and reliability.

Even non-religious people can recognize that, when it comes to deploying algorithms, the pope’s concern makes perfect sense.

Inner Workings: Crop researchers harness artificial intelligence to breed crops for the changing climate (PNAS)

Carolyn Beans PNAS November 3, 2020 117 (44) 27066-27069; first published October 14, 2020; https://doi.org/10.1073/pnas.2018732117

Until recently, the field of plant breeding looked a lot like it did in centuries past. A breeder might examine, for example, which tomato plants were most resistant to drought and then cross the most promising plants to produce the most drought-resistant offspring. This process would be repeated, plant generation after generation, until, over the course of roughly seven years, the breeder arrived at what seemed the optimal variety.

Figure 1: Researchers at ETH Zürich use standard color images and thermal images collected by drone to determine how plots of wheat with different genotypes vary in grain ripeness. Image credit: Norbert Kirchgessner (ETH Zürich, Zürich, Switzerland).

Now, with the global population expected to swell to nearly 10 billion by 2050 (1) and climate change shifting growing conditions (2), crop breeder and geneticist Steven Tanksley doesn’t think plant breeders have that kind of time. “We have to double the productivity per acre of our major crops if we’re going to stay on par with the world’s needs,” says Tanksley, a professor emeritus at Cornell University in Ithaca, NY.

To speed up the process, Tanksley and others are turning to artificial intelligence (AI). Using computer science techniques, breeders can rapidly assess which plants grow the fastest in a particular climate, which genes help plants thrive there, and which plants, when crossed, produce an optimum combination of genes for a given location, opting for traits that boost yield and stave off the effects of a changing climate. Large seed companies in particular have been using components of AI for more than a decade. With computing power rapidly advancing, the techniques are now poised to accelerate breeding on a broader scale.

AI is not, however, a panacea. Crop breeders still grapple with tradeoffs such as higher yield versus marketable appearance. And even the most sophisticated AI cannot guarantee the success of a new variety. But as AI becomes integrated into agriculture, some crop researchers envisage an agricultural revolution with computer science at the helm.

An Art and a Science

During the “green revolution” of the 1960s, researchers developed new chemical pesticides and fertilizers along with high-yielding crop varieties that dramatically increased agricultural output (3). But the reliance on chemicals came with the heavy cost of environmental degradation (4). “If we’re going to do this sustainably,” says Tanksley, “genetics is going to carry the bulk of the load.”

Plant breeders lean not only on genetics but also on mathematics. As the genomics revolution unfolded in the early 2000s, plant breeders found themselves inundated with genomic data that traditional statistical techniques couldn’t wrangle (5). Plant breeding “wasn’t geared toward dealing with large amounts of data and making precise decisions,” says Tanksley.

In 1997, Tanksley began chairing a committee at Cornell that aimed to incorporate data-driven research into the life sciences. There, he encountered an engineering approach called operations research that translates data into decisions. In 2006, Tanksley cofounded the Ithaca, NY-based company Nature Source Improved Plants on the principle that this engineering tool could make breeding decisions more efficient. “What we’ve been doing almost 15 years now,” says Tanksley, “is redoing how breeding is approached.”

A Manufacturing Process

Such approaches try to tackle complex scenarios. Suppose, for example, a wheat breeder has 200 genetically distinct lines. The breeder must decide which lines to breed together to optimize yield, disease resistance, protein content, and other traits. The breeder may know which genes confer which traits, but it’s difficult to decipher which lines to cross in what order to achieve the optimum gene combination. The number of possible combinations, says Tanksley, “is more than the stars in the universe.”

An operations research approach enables a researcher to solve this puzzle by defining the primary objective and then using optimization algorithms to predict the quickest path to that objective given the relevant constraints. Auto manufacturers, for example, optimize production given the expense of employees, the cost of auto parts, and fluctuating global currencies. Tanksley’s team optimizes yield while selecting for traits such as resistance to a changing climate. “We’ve seen more erratic climate from year to year, which means you have to have crops that are more robust to different kinds of changes,” he says.

For each plant line included in a pool of possible crosses, Tanksley inputs DNA sequence data, phenotypic data on traits like drought tolerance, disease resistance, and yield, as well as environmental data for the region where the plant line was originally developed. The algorithm projects which genes are associated with which traits under which environmental conditions and then determines the optimal combination of genes for a specific breeding goal, such as drought tolerance in a particular growing region, while accounting for genes that help boost yield. The algorithm also determines which plant lines to cross together in which order to achieve the optimal combination of genes in the fewest generations.
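
To make the idea concrete, here is a minimal sketch, assuming a toy mid-parent prediction and a made-up weighted score: it enumerates every possible pairing of a few hypothetical lines and picks the cross with the best predicted offspring. The line names, trait values, and weights are invented, and Tanksley's actual pipeline optimizes multi-generation crossing plans under far more constraints.

```python
# Toy sketch of choosing a single breeding cross by optimization.
# All data, weights, and the scoring rule are hypothetical.
from itertools import combinations

# Predicted trait values per line (stand-ins for genomic predictions).
lines = {
    "L1": {"yield": 7.2, "drought_tolerance": 0.4, "disease_resistance": 0.9},
    "L2": {"yield": 6.1, "drought_tolerance": 0.8, "disease_resistance": 0.5},
    "L3": {"yield": 6.8, "drought_tolerance": 0.7, "disease_resistance": 0.7},
}

# Breeding objective: weights express how much each trait matters.
weights = {"yield": 0.5, "drought_tolerance": 0.3, "disease_resistance": 0.2}

def predicted_offspring(parent_a, parent_b):
    """Naively predict offspring traits as the mid-parent value."""
    return {trait: (parent_a[trait] + parent_b[trait]) / 2 for trait in parent_a}

def score(traits):
    """Weighted sum of predicted trait values."""
    return sum(weights[trait] * value for trait, value in traits.items())

# Enumerate all possible crosses and keep the best-scoring one.
best_cross = max(
    combinations(lines, 2),
    key=lambda pair: score(predicted_offspring(lines[pair[0]], lines[pair[1]])),
)
print("Best cross under these assumptions:", best_cross)
```

A real operations research formulation would also decide the order of crosses across generations, which is what makes the search space so enormous.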

Nature Source Improved Plants conducts, for example, a papaya program in southeastern Mexico where the once predictable monsoon season has become erratic. “We are selecting for varieties that can produce under those unknown circumstances,” says Tanksley. But the new papaya must also stand up to ringspot, a virus that nearly wiped papaya from Hawaii altogether before another Cornell breeder developed a resistant transgenic variety (6). Tanksley’s papaya isn’t as disease resistant. But by plugging “rapid growth rate” into their operations research approach, the team bred papaya trees that produce copious fruit within a year, before the virus accumulates in the plant.

“Plant breeders need operations research to help them make better decisions,” says William Beavis, a plant geneticist and computational biologist at Iowa State in Ames, who also develops operations research strategies for plant breeding. To feed the world in rapidly changing environments, researchers need to shorten the process of developing a new cultivar to three years, Beavis adds.

The big seed companies have investigated use of operations research since around 2010, with Syngenta, headquartered in Basel, Switzerland, leading the pack, says Beavis, who spent over a decade as a statistical geneticist at Pioneer Hi-Bred in Johnston, IA, a large seed company now owned by Corteva, which is headquartered in Wilmington, DE. “All of the soybean varieties that have come on the market within the last couple of years from Syngenta came out of a system that had been redesigned using operations research approaches,” he says. But large seed companies primarily focus on grains key to animal feed such as corn, wheat, and soy. To meet growing food demands, Beavis believes that the smaller seed companies that develop vegetable crops that people actually eat must also embrace operations research. “That’s where operations research is going to have the biggest impact,” he says, “local breeding companies that are producing for regional environments, not for broad adaptation.”

In collaboration with Iowa State colleague and engineer Lizhi Wang and others, Beavis is developing operations research-based algorithms to, for example, help seed companies choose whether to breed one variety that can survive in a range of different future growing conditions or a number of varieties, each tailored to specific environments. Two large seed companies, Corteva and Syngenta, and Kromite, a Lambertville, NJ-based consulting company, are partners on the project. The results will be made publicly available so that all seed companies can learn from their approach.

Figure 2: Nature Source Improved Plants (NSIP) speeds up its papaya breeding program in southeastern Mexico by using decision-making approaches more common in engineering. Image credit: Nature Source Improved Plants/Jesús Morales.

Drones and Adaptations

Useful farming AI requires good data, and plenty of it. To collect sufficient inputs, some researchers take to the skies. Crop researcher Achim Walter of the Institute of Agricultural Sciences at ETH Zürich in Switzerland and his team are developing techniques to capture aerial crop images. Every other day for several years, they have deployed image-capturing sensors over a wheat field containing hundreds of genetic lines. They fly their sensors on drones or on cables suspended above the crops or incorporate them into handheld devices that a researcher can use from an elevated platform (7).

Meanwhile, they’re developing imaging software that quantifies growth rate captured by these images (8). Using these data, they build models that predict how quickly different genetic lines grow under different weather conditions. If they find, for example, that a subset of wheat lines grew well despite a dry spell, then they can zero in on the genes those lines have in common and incorporate them into new drought-resistant varieties.

Research geneticist Edward Buckler at the US Department of Agriculture and his team are using machine learning to identify climate adaptations in 1,000 species in a large grouping of grasses spread across the globe. The grasses include food and bioenergy crops such as maize, sorghum, and sugar cane. Buckler says that when people rank the most photosynthetically efficient and water-efficient species, this group comes out on top. Still, he and collaborators, including plant scientist Elizabeth Kellogg of the Donald Danforth Plant Science Center in St. Louis, MO, and computational biologist Adam Siepel of Cold Spring Harbor Laboratory in NY, want to uncover genes that could make crops in this group even more efficient for food production in current and future environments. The team is first studying a select number of model species to determine which genes are expressed under a range of different environmental conditions. They’re still probing just how far this predictive power can go.

Such approaches could be scaled up—massively. To probe the genetic underpinnings of climate adaptation for crop species worldwide, Daniel Jacobson, the chief researcher for computational systems biology at Oak Ridge National Laboratory in TN, has amassed “climatype” data for every square kilometer of land on Earth. Using the Summit supercomputer, they then compared each square kilometer to every other square kilometer to identify similar environments (9). The result can be viewed as a network of GPS points connected by lines that show the degree of environmental similarity between points.
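
A minimal sketch of that pairwise comparison, assuming a handful of made-up climate variables, a cosine-similarity measure, and an arbitrary threshold (the Summit analysis uses far richer data and its own similarity metric):

```python
# Sketch: compare every grid cell's "climatype" with every other cell's.
# Climate variables, values, and the threshold are illustrative only.
import numpy as np

# Rows = grid cells, columns = climate variables
# (e.g. mean temperature, annual rainfall, growing-season length).
climatypes = np.array([
    [25.0, 1200.0, 210.0],   # cell A
    [24.0, 1100.0, 200.0],   # cell B
    [5.0,   400.0,  90.0],   # cell C
])

# Standardize each variable so no single unit dominates the comparison.
z = (climatypes - climatypes.mean(axis=0)) / climatypes.std(axis=0)

# Cosine similarity between every pair of cells.
unit = z / np.linalg.norm(z, axis=1, keepdims=True)
similarity = unit @ unit.T

# Keep only strongly similar pairs as edges of an environmental network.
threshold = 0.9
edges = [(i, j) for i in range(len(z)) for j in range(i + 1, len(z))
         if similarity[i, j] > threshold]
print(edges)  # here only (0, 1): cells A and B share a similar environment
```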

In collaboration with the US Department of Energy’s Center for Bioenergy Innovation, the team combines this climatype data with GPS coordinates associated with individual crop genotypes to project which genes and genetic interactions are associated with specific climate conditions. Right now, they’re focused on bioenergy and feedstocks, but they’re poised to explore a wide range of food crops as well. The results will be published so that other researchers can conduct similar analyses.

The Next Agricultural Revolution

Despite these advances, the transition to AI can be unnerving. Operations research can project an ideal combination of genes, but those genes may interact in unpredictable ways. Tanksley’s company hedges its bets by engineering 10 varieties for a given project in hopes that at least one will succeed.

On the other hand, such a directed approach could miss happy accidents, says Molly Jahn, a geneticist and plant breeder at the University of Wisconsin–Madison. “For me, breeding is much more like art. I need to see the variation and I don’t prejudge it,” she says. “I know what I’m after, but nature throws me curveballs all the time, and I probably can’t count the varieties that came from curveballs.”

There are also inherent tradeoffs that no algorithm can overcome. Consumers may prefer tomatoes with a leafy crown that stays green longer. But the price a breeder pays for that green calyx is one percent of the yield, says Tanksley.

Image recognition technology comes with its own host of challenges, says Walter. “To optimize algorithms to an extent that makes it possible to detect a certain trait, you have to train the algorithm thousands of times.” In practice, that means snapping thousands of crop images in a range of light conditions. Then there’s the ground-truthing. To know whether the models work, Walter and others must measure the trait they’re after by hand. Keen to know whether the model accurately captures the number of kernels on an ear of corn? You’d have to count the kernels yourself.

Despite these hurdles, Walter believes that computer science has brought us to the brink of a new agricultural revolution. In a 2017 PNAS Opinion piece, Walter and colleagues described emerging “smart farming” technologies—from autonomous weeding vehicles to moisture sensors in the soil (10). The authors worried, though, that only big industrial farms can afford these solutions. To make agriculture more sustainable, smaller farms in developing countries must have access as well.

Fortunately, “smart breeding” advances may have wider reach. Once image recognition technology becomes more developed for crops, which Walter expects will happen within the next 10 years, deploying it may be relatively inexpensive. Breeders could operate their own drones and obtain more precise ratings of traits like time to flowering or number of fruits in shorter time, says Walter. “The computing power that you need once you have established the algorithms is not very high.”

The genomic data so vital to AI-led breeding programs is also becoming more accessible. “We’re really at this point where genomics is cheap enough that you can apply these technologies to hundreds of species, maybe thousands,” says Buckler.

Plant breeding has “entered the engineered phase,” adds Tanksley. And with little time to spare. “The environment is changing,” he says. “You have to have a faster breeding process to respond to that.”

Published under the PNAS license.

References

1. United Nations, Department of Economic and Social Affairs, Population Division, World Population Prospects 2019: Highlights, (United Nations, New York, 2019).

2. N. Jones, “Redrawing the map: How the world’s climate zones are shifting” Yale Environment 360 (2018). https://e360.yale.edu/features/redrawing-the-map-how-the-worlds-climate-zones-are-shifting. Accessed 14 May 2020.

3. P. L. Pingali, Green revolution: Impacts, limits, and the path ahead. Proc. Natl. Acad. Sci. U.S.A. 109, 12302–12308 (2012).

4. D. Tilman, The greening of the green revolution. Nature 396, 211–212 (1998).

5. G. P. Ramstein, S. E. Jensen, E. S. Buckler, Breaking the curse of dimensionality to identify causal variants in Breeding 4. Theor. Appl. Genet. 132, 559–567 (2019).

6. D. Gonsalves, Control of papaya ringspot virus in papaya: A case study. Annu. Rev. Phytopathol. 36, 415–437 (1998).

7. N. Kirchgessner et al., The ETH field phenotyping platform FIP: A cable-suspended multi-sensor system. Funct. Plant Biol. 44, 154–168 (2016).

8. K. Yu, N. Kirchgessner, C. Grieder, A. Walter, A. Hund, An image analysis pipeline for automated classification of imaging light conditions and for quantification of wheat canopy cover time series in field phenotyping. Plant Methods 13, 15 (2017).

9. J. Streich et al., Can exascale computing and explainable artificial intelligence applied to plant biology deliver on the United Nations sustainable development goals? Curr. Opin. Biotechnol. 61, 217–225 (2020).

10. A. Walter, R. Finger, R. Huber, N. Buchmann, Opinion: Smart farming is key to developing sustainable agriculture. Proc. Natl. Acad. Sci. U.S.A. 114, 6148–6150 (2017).

The Most Common Pain Relief Drug in The World Induces Risky Behaviour, Study Suggests (Science Alert)


Peter Dockrill

9 September 2020


One of the most consumed drugs in the US – and the most commonly taken analgesic worldwide – could be doing a lot more than simply taking the edge off your headache, new evidence suggests.

Acetaminophen, also known as paracetamol and sold widely under the brand names Tylenol and Panadol, also increases risk-taking, according to a new study that measured changes in people’s behaviour when under the influence of the common over-the-counter medication.

“Acetaminophen seems to make people feel less negative emotion when they consider risky activities – they just don’t feel as scared,” says neuroscientist Baldwin Way from The Ohio State University.

“With nearly 25 percent of the population in the US taking acetaminophen each week, reduced risk perceptions and increased risk-taking could have important effects on society.”

The findings add to a recent body of research suggesting that acetaminophen’s effects on pain reduction extend to various psychological processes as well, lowering people’s receptivity to hurt feelings, reducing empathy, and even blunting cognitive functions.

In a similar way, the new research suggests people’s affective ability to perceive and evaluate risks can be impaired when they take acetaminophen. While the effects might be slight, they’re definitely worth noting, given acetaminophen is the most common drug ingredient in America, found in over 600 different kinds of over-the-counter and prescription medicines.

In a series of experiments involving over 500 university students as participants, Way and his team measured how a single 1,000 mg dose of acetaminophen (the recommended maximum adult single dosage) randomly assigned to participants affected their risk-taking behaviour, compared against placebos randomly given to a control group.

In each of the experiments, participants had to pump up an uninflated balloon on a computer screen, with each single pump earning imaginary money. Their instructions were to earn as much imaginary money as possible by pumping the balloon as much as possible, but to make sure not to pop the balloon, in which case they would lose the money.
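
The task described is a version of the Balloon Analogue Risk Task. The toy simulation below, with invented pop probabilities and payoffs rather than the study's actual parameters, illustrates the trade-off participants face: pumping more raises the expected payout per balloon but also the chance of losing it all.

```python
# Toy simulation of a balloon-pumping risk task.
# The pop probability and payoff per pump are invented for illustration.
import random

def play_balloon(pumps, pay_per_pump=0.05, pop_prob=0.03, rng=random):
    """Return earnings for one balloon: 0 if it pops, otherwise pumps * pay."""
    for _ in range(pumps):
        if rng.random() < pop_prob:
            return 0.0           # balloon burst: money for this balloon is lost
    return pumps * pay_per_pump  # cashed out safely

def average_earnings(pumps, trials=10_000, seed=42):
    rng = random.Random(seed)
    return sum(play_balloon(pumps, rng=rng) for _ in range(trials)) / trials

# A cautious strategy versus riskier ones: more pumps, more bursts.
for strategy in (5, 20, 40):
    print(f"{strategy} pumps -> average earnings {average_earnings(strategy):.3f}")
```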

The results showed that the students who took acetaminophen engaged in significantly more risk-taking during the exercise, relative to the more cautious and conservative placebo group. On the whole, those on acetaminophen pumped (and burst) their balloons more than the controls.

“If you’re risk-averse, you may pump a few times and then decide to cash out because you don’t want the balloon to burst and lose your money,” Way says.

“But for those who are on acetaminophen, as the balloon gets bigger, we believe they have less anxiety and less negative emotion about how big the balloon is getting and the possibility of it bursting.”

In addition to the balloon simulation, participants also filled out surveys during two of the experiments, rating the level of risk they perceived in various hypothetical scenarios, such as betting a day’s income on a sporting event, bungee jumping off a tall bridge, or driving a car without a seatbelt.

In one of the surveys, acetaminophen consumption did appear to reduce perceived risk compared to the control group, although in another similar survey, the same effect wasn’t observed.

Overall, however, based on an average of results across the various tests, the team concludes that there is a significant relationship between taking acetaminophen and choosing more risk, even if the observed effect can be slight.

That said, they acknowledge the drug’s apparent effects on risk-taking behaviour could also be interpreted via other kinds of psychological processes, such as reduced anxiety, perhaps.

“It may be that as the balloon increases in size, those on placebo feel increasing amounts of anxiety about a potential burst,” the researchers explain.

“When the anxiety becomes too much, they end the trial. Acetaminophen may reduce this anxiety, thus leading to greater risk taking.”

Exploring such psychological alternative explanations for this phenomenon – as well as investigating the biological mechanisms responsible for acetaminophen’s effects on people’s choices in situations like this – should be addressed in future research, the team says.

While they’re at it, scientists will no doubt also have future opportunities to investigate the role and efficacy of acetaminophen in pain relief more broadly. Studies in recent years have found that in many medical scenarios the drug can be ineffective at relieving pain, is sometimes no better than a placebo, and can invite other kinds of health problems.

Despite the seriousness of those findings, acetaminophen nonetheless remains one of the most used medications in the world, considered an essential medicine by the World Health Organisation, and recommended by the CDC as the primary drug you should probably take to ease symptoms if you think you might have coronavirus.

In light of what we’re finding out about acetaminophen, we might want to rethink some of that advice, Way says.

“Perhaps someone with mild COVID-19 symptoms may not think it is as risky to leave their house and meet with people if they’re taking acetaminophen,” Way says.

“We really need more research on the effects of acetaminophen and other over-the-counter drugs on the choices and risks we take.”

The findings are reported in Social Cognitive and Affective Neuroscience.

A Professor of Disasters and Health on Covid-19 (Nautilus)

Posted By Ilan Kelman on Mar 16, 2020

It is no mystery why pandemics happen. Those with the knowledge, wisdom, and resources must choose to avoid these disasters that afflict everyone. Photograph by Pavel L Photo and Video / Shutterstock

A new virus sweeps the world, closing borders, shutting down arts and sports, and killing thousands of people. Is this coronavirus pandemic, with the disease named Covid-19, simply a natural disaster, a culling of overpopulation as suggested by callous commentators who seem to revel in human misery? Is it nature’s rebuttal to human-caused climate change, forcing us to reduce fossil fuel-based transportation and overconsumption (apart from toilet paper)? The answer is neither. As with almost all disasters, the Covid-19 disaster is the outcome of human choices.

The Earth, with its microorganisms, tectonic activity, powerful weather, and other phenomena, has long posed dangers to humans. We know this, so it is up to us to deal with it. Sometimes we manage and sometimes we do not. Sometimes we are forced into situations with few choices, such as impoverished people living on the slopes of Mexico City’s volcano or in the subsiding floodplains of Jakarta. Not everyone can or should be a planner or engineer, to avoid houses built on soils prone to liquefying in an earthquake or offices lacking basic seismic reinforcement. Sometimes, we need to trust the zoning regulations and building codes—and their monitoring and enforcement—to keep us safe. Too often, gaps are revealed only after people have died, from the collapse of the CTV Building in Christchurch, New Zealand, during the 2011 earthquake, to New Orleans flooding during Hurricane Katrina in 2005. Those who suffer most, from Australia’s 2020 bushfires to Haiti’s 2010 earthquake, tend to have the fewest options for countering their vulnerabilities which were created by others.


When we are vulnerable to nature, it is because societal actions set people up to be harmed by nature. As we cannot blame nature for disasters, we should avoid the phrase “natural disaster.” They are just “disasters.” It could be shoddily built infrastructure, planning regulations that are broken or absent, insurance that is unaffordable or unavailable, poor communication of warnings, or fear of assault in an evacuation shelter. It is the same with disease.

The World Health Organization of the United Nations was lambasted for being far too slow to observe and respond to what became the largest Ebola epidemic yet known, in West Africa between 2014 and 2016. In the years before, donor countries to the WHO had slashed the funds available, particularly hitting the division responsible for surveilling, monitoring, preparing for, and responding to possible epidemics. Experienced staff departed, communication lines to health systems around the world slackened, and institutional memory faded. Not that the UN’s organizations are perfect otherwise, displaying their own operational failings alongside geographic and cultural biases. Plus, many of the Ebola-struck countries—for instance, Guinea, Liberia, and Sierra Leone—have long lacked adequate health systems, with the governments mired in corruption, conflict, external exploitation, and incompetence. Deficient local, national, and international governance for epidemics meant that Ebola spread far faster and farther afield than it would have if health systems had been supported. A further illustration comes from infected people ending up in the United Kingdom and the United States, yet neither country experienced an Ebola outbreak nor was there ever a pandemic. When it was decided that the spread of Ebola should be stopped, knowledge, resources, and actions were harnessed to stop the spread of Ebola. Earlier choices in West Africa, especially long-term backing for health systems, would have curtailed the disease far sooner.

And so we come to Covid-19. When a strange form of pneumonia appeared in patients in Wuhan, China in December 2019, medical staff reported it and soon identified the origin in one market. They isolated the new virus and publicly announced its genetic sequence. Authorities gave assurances that transmission between humans was not possible and that the virus was under control, despite evidence that neither was the case. Medical staff in Wuhan noticing the sickness explained that they were not permitted to broadcast their knowledge about it. Ai Fen, an emergency department doctor, was reprimanded and told to keep quiet. An ophthalmologist, Li Wenliang, was intimidated and silenced. He eventually died of coronavirus, with the media adorning him with the poignant label of “whistle blower.”

It is a choice to institute what is now referred to as a “cover up” when a potential public health threat emerges. It is a choice not to listen to health professionals hired in key positions when they are trying to save lives through public health measures. It is a choice to have opaque dissemination procedures and to try to shut down information flow. Now that the pandemic has been created by choices early on, it is a choice that many others are making to panic-buy soap while others are not bothering to wash their hands properly or to stop touching their food or face with unwashed hands. So much of disease is about human behavior. This in no way diminishes the importance of the essential medical responses. Without vaccines, smallpox, polio, rinderpest, measles, mumps, and a whole host of other lethal diseases would continue to run rampant. Along with antibiotics and other pharmaceuticals, vaccines not only save lives daily, but also reduce the costs of running health systems by stopping illness.

Health systems must have technologies and tools—dialysis machines, isolation wards, defibrillators, and stents within the dizzying array—but must not stop at technical means and buildings. Any health system must be underpinned by people, training, and experience—exactly what many of the authorities disdained when people in Wuhan suddenly fell ill. Earlier choices in China might have curtailed the spread of Covid-19 before it morphed into a pandemic. Even basic hygiene when dealing with animals might have prevented the virus from jumping species to humans.

Today, diseases targeted for eradication include rubella, measles, dracunculiasis (Guinea worm disease), and polio. The latter two remain endemic in conflict zones, often reappearing due to war, like polio did in 2013, in Syria, where it had disappeared a decade previously. Similarly, dracunculiasis is close to being eradicated, stubbornly remaining in areas wracked by violence including Chad and South Sudan. Choices to target these diseases are nonetheless preventing epidemics of them, with eradication in sight. London and Paris famously eliminated cholera in the 19th century by building sewage systems, among other actions. Malaria used to be prevalent in southern England and across the US. Dedicated efforts eradicated it and continue to prevent its re-introduction, despite cases from travelers and near international airports. We can continue these efforts by choice or we can let malaria return.

We know that, by disturbing ecosystems, we make pandemics beyond Covid-19 more likely to occur. “In Africa, we see a lot of incursion driven by oil or mineral extraction in areas that typically had few human populations,” Dennis Carroll, an infectious disease researcher, told Nautilus editor Kevin Berger. “The problem is not only moving workers and establishing camps in these domains, but building roads that allow for even more movement of populations. Roads also allow for the movement of wildlife animals, which may be part of a food trade, to make their way into urban settlements. All these dramatic changes increase the potential spread of infection.” It is no mystery why pandemics happen. Those with the knowledge, wisdom, and resources must choose to avoid these disasters that afflict everyone.

Ilan Kelman is Professor of Disasters and Health at University College London and the author of Disaster By Choice: How Our Actions Turn Natural Hazards into Catastrophes. Follow him on Twitter/Instagram @IlanKelman.

Paying for pain: What motivates tough mudders and other weekend warriors? (Science Daily)

Date: March 22, 2017
Source: Journal of Consumer Research
Summary: Why do people pay for experiences deliberately marketed as painful? According to a new study, consumers will pay big money for extraordinary — even painful — experiences to offset the physical malaise resulting from today’s sedentary lifestyles.

Why do people pay for experiences deliberately marketed as painful? According to a new study in the Journal of Consumer Research, consumers will pay big money for extraordinary — even painful — experiences to offset the physical malaise resulting from today’s sedentary lifestyles.

“How do we explain that on the one hand consumers spend billions of dollars every year on analgesics and opioids, while exhausting and painful experiences such as obstacle races and ultra-marathons are gaining in popularity?” asked authors Rebecca Scott (Cardiff University), Julien Cayla (Nanyang Technological University), and Bernard Cova (KEDGE Business School).

Tough Mudder is a grueling adventure challenge involving about 25 military-style obstacles that participants — known as Mudders — must overcome in half a day. Among other challenges, its events entail running through torrents of mud, plunging into freezing water, and crawling through electric wires carrying 10,000 volts. Injuries have included spinal damage, strokes, heart attacks, and even death.

Through extensive interviews with Mudders, the authors learned that pain helps individuals deal with the reduced physicality of office life. Through sensory intensification, pain brings the body into sharp focus, allowing participants who spend much of their time sitting in front of computers to rediscover their corporeality.

In addition, the authors write, pain facilitates escape and provides temporary relief from the burdens of self-awareness. Electric shocks and exposure to icy waters might be painful, but they also allow participants to escape the demands and anxieties of modern life.

“By leaving marks and wounds, painful experiences help us create the story of a fulfilled life spent exploring the limits of the body,” the authors conclude. “The proliferation of videos recording painful experiences such as Tough Mudder happens at least partly because a fulfilled life also means exploring the body in its various possibilities.”


Journal Reference:

  1. Rebecca Scott, Julien Cayla, Bernard Cova. Selling Pain to the Saturated Self. Journal of Consumer Research, 2017; DOI: 10.1093/jcr/ucw071

A map of risk around the world (Pesquisa Fapesp)

With the exception of Japan, poor and developing countries are the most vulnerable to natural disasters

MARCOS PIVETTA | ED. 249 | NOVEMBER 2016

Because it is subject to strong earthquakes and flooding caused by tsunamis, Japan is the only developed country at very high risk of being struck by catastrophic events, according to the 2016 edition of the World Risk Report, a publication organized by the United Nations University, the German agency Alliance Development Works and the University of Stuttgart. The Asian nation sits in 17th place in the world disaster risk index, which ranks 171 countries according to the likelihood of their being hit by five types of extreme events: droughts, floods, cyclones or storms, earthquakes and sea-level rise.

The index lists the world’s territories in decreasing order of vulnerability to disasters and divides them into five categories. Each category contains 20% of the countries, which are classified as very high, high, medium, low or very low risk. The final indicator is calculated from 28 geoclimatic and socioeconomic parameters, such as the number of people exposed to disasters, the population’s income and education, and the capacity to mitigate the impact of extreme events and to adapt to change.
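
As a stripped-down sketch of how a composite index like this can be assembled, using invented countries, indicator values, and equal weights rather than the report's 28 indicators and actual weighting scheme:

```python
# Sketch: composite risk = exposure x vulnerability, then ranked into classes.
# Countries, indicator values, and weights are invented for illustration.
import numpy as np

countries = ["A", "B", "C", "D", "E"]
exposure = np.array([0.60, 0.15, 0.35, 0.50, 0.05])  # share of population exposed
vulnerability_indicators = np.array([
    # columns: susceptibility, lack of coping capacity, lack of adaptive capacity
    [0.7, 0.6, 0.5],
    [0.2, 0.3, 0.4],
    [0.5, 0.5, 0.6],
    [0.6, 0.7, 0.5],
    [0.1, 0.2, 0.2],
])

vulnerability = vulnerability_indicators.mean(axis=1)  # equal weights here
risk = exposure * vulnerability                        # composite index

# Rank countries and assign one of five risk classes, highest risk first.
labels = ["very high", "high", "medium", "low", "very low"]
class_size = max(len(countries) // 5, 1)
for rank, idx in enumerate(np.argsort(-risk)):
    print(countries[idx], round(float(risk[idx]), 3), labels[rank // class_size])
```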

Vanuatu, a small archipelago in the South Pacific 1,700 kilometers east of Australia, with 250,000 inhabitants, is the riskiest country in the world, number 1 on the index. It is exposed to earthquakes and cyclones and could be covered by water if sea level rises, not to mention its volcanism, which is not included in the index calculation. Second place goes to Tonga, a Polynesian archipelago, and third to the Philippines. Haiti, where Hurricane Matthew killed 1,300 people and displaced 35,000 in October, appears in 21st place. Brazil ranks 123rd and is classified in the low-risk category, alongside the United States, Italy, Argentina and the United Kingdom. “No index based on natural disasters is perfect,” says Lucí Hidalgo Nunes, of Unicamp. “Depending on the variables used and the weight given to them, the rankings change. But Brazil is certainly not one of the countries in the worst situation.”

Nasa aims to move Earth (The Guardian)

Scientists’ answer to global warming: nudge the planet farther from Sun

Special report: global warming


Sunday 10 June 2001


Scientists have found an unusual way to prevent our planet overheating: move it to a cooler spot.

All you have to do is hurtle a few comets at Earth, and its orbit will be altered. Our world will then be sent spinning into a safer, colder part of the solar system.

This startling idea of improving our interplanetary neighbourhood is the brainchild of a group of Nasa engineers and American astronomers who say their plan could add another six billion years to the useful lifetime of our planet – effectively doubling its working life.

‘The technology is not at all far-fetched,’ said Dr Greg Laughlin, of the Nasa Ames Research Center in California. ‘It involves the same techniques that people now suggest could be used to deflect asteroids or comets heading towards Earth. We don’t need raw power to move Earth, we just require delicacy of planning and manoeuvring.’

The plan put forward by Dr Laughlin, and his colleagues Don Korycansky and Fred Adams, involves carefully directing a comet or asteroid so that it sweeps close past our planet and transfers some of its gravitational energy to Earth.

‘Earth’s orbital speed would increase as a result and we would move to a higher orbit away from the Sun,’ Laughlin said.

Engineers would then direct their comet so that it passed close to Jupiter or Saturn, where the reverse process would occur. It would pick up energy from one of these giant planets. Later its orbit would bring it back to Earth, and the process would be repeated.
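
A rough sketch of the orbital mechanics behind this, using the standard two-body energy relation rather than the team's detailed calculations: a flyby that adds speed to Earth raises its orbital energy, and a higher orbital energy corresponds to a larger, cooler orbit.

```latex
% Specific orbital energy of Earth about the Sun (two-body approximation):
\[
  \varepsilon \;=\; \frac{v^{2}}{2} - \frac{GM_{\odot}}{r}
  \;=\; -\,\frac{GM_{\odot}}{2a}
  \qquad\Longrightarrow\qquad
  a \;=\; \frac{GM_{\odot}}{2\left(\dfrac{GM_{\odot}}{r} - \dfrac{v^{2}}{2}\right)}
\]
% If a passing comet increases Earth's speed v at radius r, the energy
% becomes less negative, so the semi-major axis a grows and the orbit
% widens; the comet loses the same energy and later recovers it from
% Jupiter or Saturn.
```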

In the short term, the plan provides an ideal solution to global warming, although the team was actually concerned with a more drastic danger. The sun is destined to heat up in about a billion years and so ‘seriously compromise’ our biosphere – by frying us.

Hence the group’s decision to try to save Earth. ‘All you have to do is strap a chemical rocket to an asteroid or comet and fire it at just the right time,’ added Laughlin. ‘It is basic rocket science.’

The plan has one or two worrying aspects, however. For a start, space engineers would have to be very careful about how they directed their asteroid or comet towards Earth. The slightest miscalculation in orbit could fire it straight at Earth – with devastating consequences.

There is also the vexed question of the Moon. As the current issue of Scientific American points out, if Earth were pushed out of its current position, it is ‘most likely the Moon would be stripped away from Earth’, radically upsetting our planet’s climate.

These criticisms are accepted by the scientists. ‘Our investigation has shown just how delicately Earth is poised within the solar system,’ Laughlin admitted. ‘Nevertheless, our work has practical implications. Our calculations show that to get Earth to a safer, distant orbit, it would have to pass through unstable zones and would need careful nurturing and nudging. Any alien astronomers observing our solar system would know that something odd had occurred, and would realise an intelligent lifeform was responsible.

‘And the same goes for us. When we look at other solar systems, and detect planets around other suns – which we are now beginning to do – we may see that planet-moving has occurred. It will give us our first evidence of the handiwork of extraterrestrial beings.’

A decade of advances in biotechnology (Folha de S.Paulo)

February 11, 2016

The Biosafety Law turns 10 while keeping pace with the latest discoveries in science

Walter Colli – Instituto de Química, Universidade de São Paulo

Over the course of 2015, a quiet biotechnology revolution took place in Brazil. That year the Comissão Técnica Nacional de Biossegurança (CTNBio), Brazil’s national technical biosafety commission, analyzed and approved a record number of technologies applicable to agriculture, medicine and energy production. The careful work of CTNBio’s members assessed 19 new transgenic products as safe for human and animal health and for the environment, among them 13 plants, three vaccines and three microorganisms or derivatives.

Prioritizing rigor in its biosafety analyses and mindful of the need to produce food more sustainably, CTNBio last year approved varieties of soybean, maize and cotton tolerant to herbicides with different modes of action. This will allow the seeds to develop their full potential and give Brazilian producers one more option for rotating technologies in weed management. Without this technological tool, farmers would be hostage to the limitations imposed by invasive plants. Insect-resistance technologies provide similar benefits.

In health, the revolution concerns methods of fighting diseases that are endemic to tropical regions. Once again acting as a partner to society, CTNBio assessed the biosafety of two recombinant dengue vaccines on an expedited basis and issued favorable opinions on both. Added to these efforts is the approval of the transgenic Aedes aegypti. The genetically modified mosquito approved in 2014 has proved an ally in fighting the insect that, besides being the dengue vector, is also associated with cases of transmission of the Zika and Chikungunya viruses and of yellow fever.

Over the past 10 years, the advent of the new CTNBio under Law 11,105 of 2005 – the Biosafety Law – has so far enabled the commercial approval of 82 genetically modified organisms (GMOs): 52 plant events; 20 veterinary vaccines; 7 microorganisms; 1 Aedes aegypti mosquito; and 2 dengue vaccines for human use. These commercial releases are the best proof that Brazil draws on innovation to find solutions to contemporary challenges.

It must be stressed, however, that matters unrelated to science also stood in the way of biotechnology’s development in 2015, as in previous years. Anti-science protesters invaded laboratories and destroyed seven years of research on transgenic eucalyptus plants, and anti-GMO groups went so far as to interrupt CTNBio meetings, breaking down doors in violent actions. Numerous untruths were published in an attempt to cast doubt on the safety of transgenics and the contributions they have been making to society. The actions of these groups are worrying because, if their ideology prevails, both Brazilian scientific progress and Brazil’s GDP will be irreversibly harmed.

Today, our Biosafety Law is regarded internationally as a model of balance between rigor in technical analyses and the institutional predictability needed for investment. Global recognition, dialogue with society and the legitimacy of technical criteria show that these 10 years are only the beginning of a long history of development and innovation in Brazil.

High-risk dams threaten 540,000 people (O Globo)

by Mariana Sanches

Imerys dam in the town of Barcarena, Pará: a leak of around 450,000 cubic meters of kaolin tailings already occurred in 2007 – Archive/“O Liberal”

SÃO PAULO – An analysis of documents from the Departamento Nacional de Produção Mineral (DNPM), the agency responsible for inspecting mining dams throughout Brazil, reveals that the tragedy that struck Mariana (MG) could be repeated at a minimum of 16 other dams across four states. The disaster that killed 11 people, left another 12 missing and crossed Minas Gerais and Espírito Santo on its way to the sea threatens another half a million people. The National Register of Mining Dams (Cadastro Nacional de Barragens de Mineração) of April 2014 shows that 16 reservoirs and one mining pit fall into the high-risk category (the structure does not offer ideal safety conditions and could collapse) and the high-potential-damage category (a failure could affect and kill people, contaminate rivers, destroy biomes and cause serious socioeconomic harm).

According to calculations by O GLOBO, if these dams burst, the tailings could reach 14 municipalities with a combined population of 540,000. Including the Serra Pelada mining pit in Pará, 780,000 people are at risk. The facilities have a combined volume of 84 million cubic meters to hold the material discarded in the mining of iron, tin, manganese, kaolin and gold. That amount is 50% greater than the quantity of mud that leaked from Samarco, which is owned by Vale and Australia’s BHP.

The tailings threaten three of Brazil’s largest river basins: that of the Paraguai River, in the heart of the Pantanal of Mato Grosso do Sul; that of the Amazon River, which irrigates the Amazon rainforest; and that of the São Francisco River, which runs through the Northeast.

COMPANIES DO NOT PROVIDE DOCUMENTS

The estimate was based on the location of the dams, the watercourses and the downstream direction, that is, the direction of river flow. Municipalities were considered at immediate risk if they lie less than 50 kilometers from the dams and in the path of the currents of the creeks, streams and rivers that cross the area.

For comparison, the mud that escaped at Mariana has already traveled roughly ten times the 50-kilometer distance used in the estimate, and it left the reservoir at a speed of about 70 km/h. If the conditions of the Fundão dam were repeated, villages in these municipalities would be affected in less than an hour.
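
For scale, the travel time implied by those figures works out to well under an hour:

```latex
\[
  t \;=\; \frac{50\ \text{km}}{70\ \text{km/h}} \;\approx\; 0.71\ \text{h} \;\approx\; 43\ \text{minutes}
\]
```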

The data used come from the DNPM and the Instituto Brasileiro de Geografia e Estatística (IBGE). None of the companies responsible for the high-risk dams provided technical reports on what would happen to their tailings if the structures collapsed, which would have allowed a more accurate mapping of the impact on the municipalities and even on those affected indirectly, through water shortages, for example. Such studies make up the Emergency Action Plans for Mining Dams, which also include the list of procedures for rescuing people and containing disasters in an emergency, and whose preparation is required by law.

“There is no reason for the companies not to make these documents public; this is important information for the population. The behavior is strange and worrying. It suggests the plan may not exist, or that it was thrown together carelessly,” warned geologist Álvaro dos Santos, of the Instituto de Pesquisas Tecnológicas.

Samarco’s contingency plan was only presented more than a week after the incident and was criticized by the Minas Gerais public prosecutor’s office. The document provided for neither an audible alarm nor training for people living in the risk area.

Among the dams listed as potentially dangerous, some companies are repeat offenders. One of them, Imerys Rio Capim Caulim S/A, was responsible for the leak of around 450,000 cubic meters of kaolin tailings (a mixture of water and whitish clay) from one of its basins in 2007. The tailings reached creeks and rivers in the municipality of Barcarena (PA). In 2014, federal prosecutors investigated at least two other leaks from the company’s tanks. The company now appears as the operator of three class A dams: high risk in terms of upkeep and high potential damage. Even so, its production was neither reduced nor halted.

Brazil is among the world’s ten largest producers of kaolin, a mineral essential for paper production. Imerys said in a statement that it did not halt its activities because the law does not require it to, and denied that the structures are out of control. “Between 2013 and 2015, around R$ 15 million was invested in the safety of dam operations,” the statement said, adding that the company systematically takes “measures such as monitoring basin levels, tracking groundwater levels and conducting stability studies of the basin embankments”. The company acknowledged that “where the Imerys processing plant is located, there are people” and said it has an emergency plan for them, but it presented neither documents nor details.

“It is obvious that activities should be suspended in these cases, but inspection does not require it. In fact, there is not even a deadline for the company to improve its structures; it can do so whenever it wants,” says prosecutor Zani Cajueiro, a specialist in the subject.

In Corumbá (MS), Vale controls Urucum Mineração, owner of two class A reservoirs used in manganese extraction. This type of activity typically produces tailings containing arsenic, a highly toxic substance, according to the Centro de Tecnologia Mineral (Cetem). Vale denied that the tailings are dangerous and said it kept operations going despite the negative assessment of the structures’ condition. It also stated that inspections carried out in 2015 reclassified the basins as low and medium risk, but presented no documents to prove this.

Gerdau AçoMinas, which operates the Bocaina dam in Ouro Preto (MG), said that in an assessment at the end of 2014 the reservoir was considered low risk and that it is out of operation. It presented a DNPM document showing the reclassification to level C. The page, however, is undated.

Owner of basins of muddy water set deep in the Amazon rainforest, Taboca Mineração is the company with the largest number of dams on the list: ten, used for tin mining. The company admitted that, in the event of a rupture, the largest of them could trigger a five-meter wave of tailings that would reach indigenous areas. It stated that the basins contain water and granite sand. The structures are not in operation and are undergoing environmental recovery. Taboca said it adopts strict safety standards, “indeed more rigorous than those required by law”.

Experts, however, question the condition of the dams, even those not rated as high risk. In Mariana, the dam that burst had been considered low risk.

“The reports are produced by the companies themselves or by consultancies they hire. The fox is guarding the henhouse,” said Francisco Fernandes, a researcher at Cetem.

The DNPM did not respond to requests for comment.

Areas of high and very high risk (IPT)

The Instituto de Pesquisas Tecnológicas signs a contract with São Paulo’s Civil Defense to identify landslide and flood areas in 10 cities

A new contract signed between the Instituto de Pesquisas Tecnológicas (IPT) and the São Paulo State Civil Defense Coordination Office provides for the mapping of areas at high and very high risk of landslides and floods in 10 municipalities. The list covers municipalities that have shown incidence and recurrence of meteorological, hydrological and geological events, according to statistical data recorded in the Integrated Civil Defense System (SIDEC), and that do not yet have risk-identification instruments covered by the state’s Preventive Civil Defense Plan.

The three-month project will be carried out by IPT’s Investigations, Risks and Natural Disasters Section through technical visits to the municipalities, followed by the organization of the information into maps, images and photographic documentation in a Geographic Information System (GIS), in order to support the management of the areas and establish technical and social parameters.

The risk levels considered follow the method developed in 2007 by the Ministry of Cities and IPT, which establishes four potential risk conditions. “It is important to stress that the project will deal with the sectors classified as high risk (R3) and very high risk (R4) in the 10 cities,” says Marcelo Fischer Gramani, project coordinator and researcher in the Section.

The cities included in the project are located in the Administrative Regions of Presidente Prudente (Adamantina, Caiabu, Inúbia Paulista and Presidente Prudente), Campinas (Caconde and Divinolândia), Itapeva (Paranapanema and Tejupá), Marília (Tupã) and Barretos (Olímpia). These municipalities were flagged as priorities because they lack up-to-date information on landslide and/or flood risks.

The main activities to be carried out by IPT include bibliographic research on existing risk-area surveys, consultation with the teams of the Municipal Civil Defense Coordination Offices on the number of call-outs at the sites to be assessed, field inspections to record risk indicators and process typologies, and the preparation of photographic documentation.

The data collected will be analyzed to underpin the technical report, which will contain information such as a description of the area assessed, the delimitation of the identified risk sectors on remote-sensing imagery, the number of properties at risk, the number of people at risk, the type of process (landslide, flood, bank undermining) and suggested interventions to minimize or eliminate the identified risks.

(IPT)

‘We cannot play God with alterations to the human genome’, warns UN (ONU)

Updated on 07/10/2015

Modification of the genetic code makes it possible to treat diseases such as cancer, but it can also produce heritable changes. UNESCO calls for clear regulation of the scientific procedures and for the public to be informed.

Photo: Flickr/ynse

“Gene therapy could be a watershed in the history of medicine, and genome editing is without doubt one of science’s greatest undertakings on behalf of humanity,” said the United Nations Educational, Scientific and Cultural Organization (UNESCO) about a report published by the International Bioethics Committee (IBC) on Monday (5).

The IBC added, however, that interventions in the human genome should be authorized only for preventive, diagnostic or therapeutic purposes that do not produce alterations passed on to descendants. The report also highlights the importance of regulation and of clear information for consumers.

The document highlighted advances in the possibility of genetic testing for hereditary diseases through gene therapy, the use of embryonic stem cells in medical research, and the use of cloning and genetic alterations for medicinal purposes. It also cites new techniques that can insert, remove and correct DNA, with the potential to treat or cure cancer and other diseases. However, these same techniques also make other changes to DNA possible, such as determining the color of a baby’s eyes.

“The great fear is that we may be trying to ‘play God’, with unforeseeable consequences, and in the end precipitating our own destruction,” warned former UN Secretary-General Kofi Annan in 2004, when asked what ethical line should set the limit for alterations to the human genome. To answer that question, UNESCO’s member states adopted in 2005 the Universal Declaration on Bioethics and Human Rights, which deals with the ethical dilemmas raised by rapid changes in medicine, science and technology.

The Mexican region that believes it is protected by ETs (BBC)

15 April 2015

BBC Mundo

Many residents of Tampico and Ciudad Madero believe the stretch of coast facing Miramar beach is the best place to spot ETs

Sitting on a sofa in a modest café in Ciudad Madero, a man invites me to meditate in order to see UFOs.

The television shows Bob Marley singing I Shot the Sheriff while, behind the counter, a woman prepares a frappuccino.

The city lies in the violent state of Tamaulipas, in northeastern Mexico, and many believe that extraterrestrials have spent decades protecting it from hurricanes.

That is because, according to the more devoted locals, when hurricanes in the region push toward the coast where the city sits, they stop abruptly and mysteriously and change direction.

Some residents say they have seen the aliens; others claim there is an underwater base about 40 kilometers off the coast and that they have seen their craft: spheres, triangles and lights.

Thinkstock

Aliens are a subject discussed openly in this region of Mexico

And everyone talks about it openly.

Civil engineer Fernando Alonso Gallardo, 68, a retiree of the state oil company Pemex and now a businessman, has a face weathered by the sun of the local beach, Miramar, a ten-kilometer strip of sand.

Through the windows of Gallardo's beachfront restaurant, El Mexicano, a breeze blows in off the Gulf of Mexico.

Gallardo tells his story to BBC Mundo, the BBC's Spanish-language service. His story, like that of many in Ciudad Madero, involves sightings of unidentified flying objects.

BBC Mundo

Hurricanes in 1933 and 1955 destroyed the Alonso family's restaurant

In 1933, when hurricanes were not yet given names, a category 5 storm struck Tampico, where Gallardo was born, near Ciudad Madero. The hurricane destroyed his father's restaurant, but the family built another.

In 1955, Hurricane Hilda, which flooded three quarters of the city and left 20,000 people homeless, hit the region again.

"I don't think there were extraterrestrials back then; if there had been, there wouldn't have been so many disasters," Gallardo says.

Hurricanes also struck in 1947, 1951 and 1966. But soon afterwards, the storms stopped hitting the region.

Researchers believe the real reason the hurricanes veer away is the presence of cold-water currents in the area. But in the neighboring cities of Tampico and Ciudad Madero, no one is unaware of the belief that something supernatural defends the region.

Sightings

Between the 19th century and the 1970s, when people saw luminous objects in the sky, they said they were witches.

In 1967, a monument to the Virgen del Carmen – patroness of the sea and of sailors – was built at the spot fishermen pass when they leave the Pánuco River, which divides the states of Tamaulipas and Veracruz.

Many saw in it the explanation for the disappearance of the hurricanes.

To this day, it is a tradition for sailors to make the sign of the cross before the statue and for captains to sound their vessels' horns, said Marco Flores, who has been the official chronicler of the Tampico city government since 1995.

The Martian theory arrived shortly afterwards.

BBC Mundo

Many believe it is ETs who protect the region from hurricanes

According to Flores, the theory was brought by a man from Mexico City who came to Tampico for work around the 1970s and insisted that, more than protecting the city, the extraterrestrials who had made contact with him were guarding their underwater bases.

Alonso Gallardo agrees. "It isn't an effort to protect the city; it's an effort to protect the city where they live, because they have found a way to be there."

Gallardo says he saw his first UFO in 1983: a disc 60 meters in diameter with yellowish lights. It happened at the end of the jetty that separates the green water of the Gulf of Mexico from the dark water of the Pánuco River.

That, say the believers, is the best place to see the objects.

'Lack of intelligence'

The believers' meeting point used to be a café inside a Walmart, but the woman who served them did not seem comfortable with the topic of conversation. So the members of the Tampico UFO Scientific Research Association moved to the Bambino restaurant in Ciudad Madero.

There, each member waits their turn to recount their experiences.

BBC Mundo

José Luis Cárdenas takes photos of the sky in which strange lights appear

At the head of the table, Eduardo Ortiz Anguiano, 83, talks about his book published last year, De Ovnis, fantasmas e outros eventos extraordinários (Of UFOs, Ghosts and Other Extraordinary Events).

Over three years he collected more than 100 accounts and became convinced: "To doubt the existence of UFOs is to lack intelligence."

And many agree. Eva Martínez says the presence of extraterrestrials gives her peace.

José Luis Cárdenas has several photographs showing strangely shaped lights – lights that, according to him, are not in the sky at the moment of the photo but appear on the camera's display.

"If the beings that visit us do not harm us, then they are protecting us, they are doing something for us. And that is how we have to look at things," he said.

The last time a hurricane heading for the Tampico area veered away was in 2013.

That year, local authorities placed a bust of a Martian on Miramar beach (it was stolen shortly afterwards) and declared that the last Tuesday of October would be celebrated as Martian Day.

"The explanation we cannot give scientifically, we give in a magical way. The people of this region have a magical way of thinking," says Flores, the Tampico chronicler.

'God likes Tampico'

On the sofa in the Ciudad Madero café, Juan Carlos Ramón López Díaz, president of the UFO researchers' association, asks me to close my eyes and keep my mind calm.

He invites me to see a luminous object that I can enter, if I wish.

Behind the counter, someone switches on a blender. I open my eyes. Despite López Díaz's help, I saw nothing.

Sabesp rushes into works without assessing risk (OESP)

Fabio Leite – O Estado de S. Paulo

15 March 2015 | 2:01 am

Companhia de Saneamento Básico do Estado de São Paulo revived shelved plans without time to study their environmental impact

SÃO PAULO – The search for new water sources to offset the short-term shortage and try to avoid an official rationing rotation in Greater São Paulo has led the Companhia de Saneamento Básico do Estado de São Paulo (Sabesp) to pull a series of projects shelved for years off the drawing board and execute them in a rush, without an Environmental Impact Study (EIA), approval by basin committees or a declared state of emergency.

So far there are six works (one already completed) involving transfers between rivers and reservoirs, aimed at increasing the water supply enough to serve 20 million people through the dry season (April to September) without declaring general rationing. The main one is the interconnection of the Rio Grande System with the Alto Tietê, the second most critical source (at 21% of capacity), in better shape only than the Cantareira.

According to Sabesp, construction has already begun on 11 kilometers of pipeline and a pumping station to carry up to 4,000 liters per second from the Billings reservoir, in the ABC region, to the Taiaçupeba reservoir, in Suzano. Completion is scheduled for July. Technicians in the Geraldo Alckmin (PSDB) administration say, however, that a work of this size would require an EIA, approval by the Alto Tietê Basin Committee, and a permit from the São Paulo Department of Water and Electric Power (DAEE).

With the likely reversal of water from the polluted central body of the Billings reservoir into the Rio Grande arm, already signaled by Sabesp, prior approval would also be needed from the State Environmental Council (Consema), along with a permit from the National Electric Energy Agency (Aneel), since the reservoir also supplies water for power generation at the Henry Borden plant in Cubatão. That entire process had to be followed for the Billings-Guarapiranga connection, via the Taquacetuba arm, during the 2000 crisis.

"Either the government declares a state of emergency to carry out these so-called emergency works without bidding or an environmental impact study, giving up competitive tendering and public participation, or it runs the tenders and produces the required reports. The way things stand, there is a glaring inconsistency," said engineer Darcy Brega Filho, a sustainability management specialist and former Sabesp employee.

Sea. The package of emergency works includes the interconnection of two coastal-basin rivers (which flow into the sea), the Itatinga and the Capivari, with rivers that are tributaries of the Jundiaí (Alto Tietê) and Guarapiranga reservoirs. The two interventions, recently announced by Sabesp, were already part of the 2004 Water and Supply Master Plan (PDAA) and had been shelved. Each is expected to increase the systems' flow by 1,000 liters per second, and they too would require approval from the Baixada Santista Basin Committee.

"There is no question that emergency works are needed to bring water to the metropolitan region, but that does not do away with a more careful evaluation of this set of transfers to gauge the projects' efficiency and their indirect effects," said water resources specialist José Galizia Tundisi, president of the International Institute of Ecology and vice-president of the Instituto Acqua.

One example government staff cite of the lack of project evaluation is the construction of 9 kilometers of pipeline to carry 1,000 liters per second from the Guaió River to the Taiaçupeba reservoir. The works began in February and are due to be completed in May, according to Sabesp. Technicians in the field say that during the dry season the river's average flow is only 300 liters per second, that is, 70% less than the intended amount.
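
For reference, the 70% figure follows directly from the two flow rates quoted above; as a back-of-the-envelope check (added here, not part of the original article):

    \frac{1000\ \mathrm{L/s} - 300\ \mathrm{L/s}}{1000\ \mathrm{L/s}} = 0.70 = 70\%

That is, the Guaió's average dry-season flow falls about 70% short of the 1,000 L/s the pipeline is designed to carry.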

Let's defend the water (Conta d'Água)

24 Feb 2015

Have you been taking quick "cat baths" to save water? Not flushing the toilet when there's only urine in it? Using the water from the washing machine to clean the yard? Is your home full of water tanks and buckets to store rainwater?

Hi! We're talking to you because we're in the same situation.

Governor Geraldo Alckmin and Sabesp — who live in a fantasy land — say there is no rationing, that there is no water shortage in the city.

But in real life, either the water goes out every day, or it goes out for many days in a row, as has already been happening in the east side of the capital.

Now the governor and Sabesp say the water sources are recovering thanks to the summer rains.

They want to reassure us because they are afraid of people in the streets.

The truth is that the reservoirs, dams and rivers that supply the São Paulo metropolitan region are at the lowest levels in their history.

The rains that have been pouring down on the city are like loose change going into an account that is already deep in overdraft. Yes, because tapping the dead storage volume of the Cantareira system (as is still happening) is like going into overdraft: easy to get in, hard to get out.

When the dry season begins, in April, that is when things will get ugly:

A climatic drought with no water reserves means more disease, closed factories, shops and schools, and unemployment.

In a word: suffering.

Worst of all, while we scrimp and save and put up with interruptions in the water supply, Sabesp has rewarded 500 privileged companies with the right to receive millions of liters of drinking water every single day — and they pay a sweetheart rate, far lower than what ordinary citizens pay.

Is that fair?

The responsibility for so much mismanagement lies with the state government, which did not make the investments needed to reduce leaks in the pipes of the supply network; which privatized part of Sabesp and handed out fat slices of the profits to shareholders on the New York stock exchange; which preferred to blame Saint Peter rather than take action; and which showers the company's friends with favors.

And they still want to raise the water tariff in April!

Because we refuse to go on being deceived; because the population demands an emergency plan to deal with the drought; because we will not pay a single cent more for water that Sabesp does not deliver; and because we do not accept privileged access to water, we are holding a large public rally this Thursday (February 26).

The initiative, led by the Homeless Workers' Movement (MTST), has already won the support of several social movements and environmentalists. The rally gathers at 5 pm at Largo da Batata, in Pinheiros. From there we will march to the Palácio dos Bandeirantes, the mansion where Governor Geraldo Alckmin lives.

We will tell him loud and clear that we refuse to foot the bill for a crisis we did not create;

That we demand good, clean, crystal-clear water for everyone (and not just for the richest and most privileged);

Enough irresponsibility with people's lives!

Sabesp admits rationing rotation could contaminate water (Estadão)

Pedro Venceslau and Fabio Leite – O Estado de S. Paulo

26 February 2015 | 3:00 am

Director told the CPI that the problem would not put users at risk; the company also admitted that network pressure is outside the standard

SÃO PAULO – The risk of water contamination acknowledged on Wednesday, the 25th, by the metropolitan director of the Companhia de Saneamento Básico do Estado de São Paulo (Sabesp), Paulo Massato, in the event of an official rationing rotation is already a reality in some higher-elevation areas of Greater São Paulo. These are places where the network loses pressure after shut-off valves in the street are closed manually, as a senior company official admitted to the Estado earlier this month.

"If the rotation is implemented, the network becomes depressurized, especially in areas with rugged topography, at points where the piping runs downhill. If the groundwater is contaminated, that increases the risk of contamination (of the water in the network)," Massato said on Wednesday, during a session of the Sabesp CPI at the City Council.

The result of this contamination, he said, would not put consumers' lives at risk, but could cause dysentery, for example. "We have enough medicine today to minimize the risk to people's lives. A case of dysentery can be more or less serious, but it is a risk (of implementing the rotation) that we want to avoid," he added. Despite the warning, he said the utility could quickly "decontaminate" the affected water.

Hélvio Romero/Estadão

'We are in an abnormal situation. We would not manage to supply 6 million inhabitants if we maintained normality,' said Massato

Earlier this month, a Sabesp official admitted to the Estado that in the 40% of the network with no pressure-reducing valves (VRPs) installed, water rationing is done by closing valves manually, as the paper's reporters witnessed in Vila Brasilândia, in the north of the capital. According to him, the maneuver "does not completely empty" the network, but "depressurizes higher points."

"The low-lying areas keep their water. If there is no excessive consumption, most of the network keeps its water. It does end up depressurizing the higher areas; that does happen. So much so that when the valve is opened to refill the network, the higher and more distant areas end up suffering more, going longer without water," he said.

For engineer Antonio Giansante, professor of water engineering at Mackenzie, the risk of contamination when the network is shut off is high. "In the event that the pipe is dry, water of uncontrolled quality, generally contaminated because of the sewage collection networks, can get into Sabesp's network."

According to people close to Governor Geraldo Alckmin (PSDB), the statement displeased him, since the rotation has not been ruled out. Massato had already embarrassed the government by saying, on January 27, that São Paulo could go up to five days a week without water if rationing were imposed.

Outside the standard. Massato and Sabesp's president, Jerson Kelman, who also testified before the CPI, admitted to the councilors that the company keeps the water pressure in the network below the level recommended by the Brazilian Association of Technical Standards (ABNT), as the Estado revealed earlier this month. According to the standard, at least 10 meters of water column are needed to fill all household tanks.

"We are guaranteeing 1 meter of water column, preserving the distribution network. But that is not enough pressure to reach the rooftop tank," Massato admitted. "We are below the 10 meters of water column, especially in the higher areas farthest from the reservoirs."
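
For a rough sense of scale, hydrostatic pressure is p = \rho g h; assuming fresh water (\rho \approx 1000\ \mathrm{kg/m^3}, g \approx 9.81\ \mathrm{m/s^2}), a back-of-the-envelope conversion (added here, not part of the original article) gives:

    p_{10\,\mathrm{m}} = 1000 \times 9.81 \times 10 \approx 98\ \mathrm{kPa} \qquad\qquad p_{1\,\mathrm{m}} \approx 9.8\ \mathrm{kPa}

So the roughly 1 meter of water column Sabesp says it is guaranteeing corresponds to about one tenth of the pressure implied by the ABNT reference of 10 meters.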

"This is a mitigating measure to avoid something much worse for the population, which is the rotation," Kelman said. "There are only a few points in the network where we do not have the pressure required by ABNT for normal conditions. This is not a choice Sabesp is making. We are not in normal conditions," he added.

In December, Alckmin had said that Sabesp complied "rigorously" with the technical standard. Sabesp was notified by the São Paulo State Sanitation and Energy Regulatory Agency (Arsesp) and responded on Tuesday to the questions raised about the network maneuvers. The regulator, however, has not yet taken a position.

Piped air. Asked about the State Prosecutor's Office investigation into Sabesp's alleged billing for "piped air," revealed by the Estado, company executives said the practice affected only 2% of customers. Of the 22,000 complaints registered in February about unduly high bills, 500 blamed piped air. The problem occurs when water flows back through the network and pushes air into household connections, which can distort the water meter reading. / WITH REPORTING BY RICARDO CHAPOLA

Panel Urges Research on Geoengineering as a Tool Against Climate Change (New York Times)

Piles at a CCI Energy Solutions coal handling plant in Shelbiana, Ky. Geoengineering proposals might counteract the effects of climate change that are the result of burning fossil fuels, such as coal. Credit: Luke Sharrett/Getty Images

With the planet facing potentially severe impacts from global warming in coming decades, a government-sponsored scientific panel on Tuesday called for more research on geoengineering — technologies to deliberately intervene in nature to counter climate change.

The panel said the research could include small-scale outdoor experiments, which many scientists say are necessary to better understand whether and how geoengineering would work.

Some environmental groups and others say that such projects could have unintended damaging effects, and could set society on an unstoppable path to full-scale deployment of the technologies.

But the National Academy of Sciences panel said that with proper governance, which it said needed to be developed, and other safeguards, such experiments should pose no significant risk.

In two widely anticipated reports, the panel — which was supported by NASA and other federal agencies, including what the reports described as the “U.S. intelligence community” — noted that drastically reducing emissions of carbon dioxide and other greenhouse gases was by far the best way to mitigate the effects of a warming planet.

A device being developed by a company called Global Thermostat is made to capture carbon dioxide from the air. This may be one solution to counteract climate change. Credit: Henry Fountain/The New York Times

But the panel, in making the case for more research into geoengineering, said, “It may be prudent to examine additional options for limiting the risks from climate change.”

“The committee felt that the need for information at this point outweighs the need for shoving this topic under the rug,” Marcia K. McNutt, chairwoman of the panel and the editor in chief of the journal Science, said at a news conference in Washington.

Geoengineering options generally fall into two categories: capturing and storing some of the carbon dioxide that has already been emitted so that the atmosphere traps less heat, or reflecting more sunlight away from the earth so there is less heat to start with. The panel issued separate reports on each.

The panel said that while the first option, called carbon dioxide removal, was relatively low risk, it was expensive, and that even if it was pursued on a planetwide scale, it would take many decades to have a significant impact on the climate. But the group said research was needed to develop efficient and effective methods to both remove the gas and store it so it remains out of the atmosphere indefinitely.

The second option, called solar radiation management, is far more controversial. Most discussions of the concept focus on the idea of dispersing sulfates or other chemicals high in the atmosphere, where they would reflect sunlight, in some ways mimicking the effect of a large volcanic eruption.

The process would be relatively inexpensive and should quickly lower temperatures, but it would have to be repeated indefinitely and would do nothing about another carbon dioxide-related problem: the acidification of oceans.

This approach might also have unintended effects on weather patterns around the world — bringing drought to once-fertile regions, for example. Or it might be used unilaterally as a weapon by governments or even extremely wealthy individuals.

Opponents of geoengineering have long argued that even conducting research on the subject presents a moral hazard that could distract society from the necessary task of reducing the emissions that are causing warming in the first place.

“A geoengineering ‘technofix’ would take us in the wrong direction,” Lisa Archer, food and technology program director of the environmental group Friends of the Earth, said in a statement. “Real climate justice requires dealing with root causes of climate change, not launching risky, unproven and unjust schemes.”

But the panel said that society had “reached a point where the severity of the potential risks from climate change appears to outweigh the potential risks from the moral hazard” of conducting research.

Ken Caldeira, a geoengineering researcher at the Carnegie Institution for Science and a member of the committee, said that while the panel felt that it was premature to deploy any sunlight-reflecting technologies today, “it’s worth knowing more about them,” including any problems that might make them unworkable.

“If there’s a real showstopper, we should know about it now,” Dr. Caldeira said, rather than discovering it later when society might be facing a climate emergency and desperate for a solution.

Dr. Caldeira is part of a small community of scientists who have researched solar radiation management concepts. Almost all of the research has been done on computers, simulating the effects of the technique on the climate. One attempt in Britain in 2011 to conduct an outdoor test of some of the engineering concepts provoked a public outcry. The experiment was eventually canceled.

David Keith, a researcher at Harvard University who reviewed the reports before they were released, said in an interview, “I think it’s terrific that they made a stronger call than I expected for research, including field research.” Along with other researchers, Dr. Keith has proposed a field experiment to test the effect of sulfate chemicals on atmospheric ozone.

Unlike some European countries, the United States has never had a separate geoengineering research program. Dr. Caldeira said establishing a separate program was unlikely, especially given the dysfunction in Congress. But he said that because many geoengineering research proposals might also help in general understanding of the climate, agencies that fund climate research might start to look favorably upon them.

Dr. Keith agreed, adding that he hoped the new reports would “break the logjam” and “give program managers the confidence they need to begin funding.”

At the news conference, Waleed Abdalati, a member of the panel and a professor at the University of Colorado, said that geoengineering research would have to be subject to governance that took into account not just the science, “but the human ramifications, as well.”

Dr. Abdalati said that, in general, the governance needed to precede the research. “A framework that addresses what kinds of activities would require governance is a necessary first step,” he said.

Raymond Pierrehumbert, a geophysicist at the University of Chicago and a member of the panel, said in an interview that while he thought that a research program that allowed outdoor experiments was potentially dangerous, “the report allows for enough flexibility in the process to follow that it could be decided that we shouldn’t have a program that goes beyond modeling.”

Above all, he said, “it’s really necessary to have some kind of discussion among broader stakeholders, including the public, to set guidelines for an allowable zone for experimentation.”