
The Point of No Return: Climate Change Nightmares Are Already Here (Rolling Stone)

The worst predicted impacts of climate change are starting to happen — and much faster than climate scientists expected

August 5, 2015

Walruses

Walruses, like these in Alaska, are being forced ashore in record numbers. Corey Accardo/NOAA/AP 

Historians may look to 2015 as the year when shit really started hitting the fan. Some snapshots: In just the past few months, record-setting heat waves in Pakistan and India each killed more than 1,000 people. In Washington state’s Olympic National Park, the rainforest caught fire for the first time in living memory. London reached 98 degrees Fahrenheit during the hottest July day ever recorded in the U.K.; The Guardian briefly had to pause its live blog of the heat wave because its computer servers overheated. In California, suffering from its worst drought in a millennium, a 50-acre brush fire swelled seventyfold in a matter of hours, jumping across the I-15 freeway during rush-hour traffic. Then, a few days later, the region was pounded by intense, virtually unheard-of summer rains. Puerto Rico is under its strictest water rationing in history as a monster El Niño forms in the tropical Pacific Ocean, shifting weather patterns worldwide.

On July 20th, James Hansen, the former NASA climatologist who brought climate change to the public’s attention in the summer of 1988, issued a bombshell: He and a team of climate scientists had identified a newly important feedback mechanism off the coast of Antarctica that suggests mean sea levels could rise 10 times faster than previously predicted: 10 feet by 2065. The authors included this chilling warning: If emissions aren’t cut, “We conclude that multi-meter sea-level rise would become practically unavoidable. Social disruption and economic consequences of such large sea-level rise could be devastating. It is not difficult to imagine that conflicts arising from forced migrations and economic collapse might make the planet ungovernable, threatening the fabric of civilization.”

Eric Rignot, a climate scientist at NASA and the University of California-Irvine and a co-author on Hansen’s study, said the new research doesn’t necessarily change the worst-case scenario for sea-level rise; it just makes that scenario much more pressing to think about and discuss, especially among world leaders. In particular, says Rignot, the new research shows a two-degree Celsius rise in global temperature — the previously agreed-upon “safe” level of climate change — “would be a catastrophe for sea-level rise.”

Hansen’s new study also shows how complicated and unpredictable climate change can be. Even as global ocean temperatures rise to their highest levels in recorded history, some parts of the ocean, near where ice is melting exceptionally fast, are actually cooling, slowing ocean circulation currents and sending weather patterns into a frenzy. Sure enough, a persistently cold patch of ocean is starting to show up just south of Greenland, exactly where previous experimental predictions of a sudden surge of freshwater from melting ice expected it to be. Michael Mann, another prominent climate scientist, recently said of the unexpectedly sudden Atlantic slowdown, “This is yet another example of where observations suggest that climate model predictions may be too conservative when it comes to the pace at which certain aspects of climate change are proceeding.”

Since storm systems and jet streams in the United States and Europe partially draw their energy from the difference in ocean temperatures, the implication of one patch of ocean cooling while the rest of the ocean warms is profound. Storms will get stronger, and sea-level rise will accelerate. Scientists like Hansen only expect extreme weather to get worse in the years to come, though Mann said it was still “unclear” whether recent severe winters on the East Coast are connected to the phenomenon.

And yet, these aren’t even the most disturbing changes happening to the Earth’s biosphere that climate scientists are discovering this year. For that, you have to look not at the rising sea levels but to what is actually happening within the oceans themselves.

Water temperatures this year in the North Pacific have never been this high for this long over such a large area — and it is already having a profound effect on marine life.

Eighty-year-old Roger Thomas runs whale-watching trips out of San Francisco. On an excursion earlier this year, Thomas spotted 25 humpbacks and three blue whales. During a survey on July 4th, federal officials spotted 115 whales in a single hour near the Farallon Islands — enough to issue a boating warning. Humpbacks are occasionally seen offshore in California, but rarely so close to the coast or in such numbers. Why are they coming so close to shore? Exceptionally warm water has concentrated the krill and anchovies they feed on into a narrow band of relatively cool coastal water. The whales are having a heyday. “It’s unbelievable,” Thomas told a local paper. “Whales are all over the place.”

Last fall, in northern Alaska, in the same part of the Arctic where Shell is planning to drill for oil, federal scientists discovered 35,000 walruses congregating on a single beach. It was the largest-ever documented “haul out” of walruses, and a sign that sea ice, their favored habitat, is becoming harder and harder to find.

Marine life is moving north, adapting in real time to the warming ocean. Great white sharks have been sighted breeding near Monterey Bay, California, the farthest north that’s ever been known to occur. A blue marlin was caught last summer near Catalina Island — 1,000 miles north of its typical range. Across California, there have been sightings of non-native animals moving north, such as Mexican red crabs.

Salmon

Salmon on the brink of dying out. Michael Quinton/Newscom

No species may be as uniquely endangered as the one most associated with the Pacific Northwest, the salmon. Every two weeks, Bill Peterson, an oceanographer and senior scientist at the National Oceanic and Atmospheric Administration’s Northwest Fisheries Science Center in Oregon, takes to the sea to collect data he uses to forecast the return of salmon. What he’s been seeing this year is deeply troubling.

Salmon are crucial to their coastal ecosystem like perhaps few other species on the planet. A significant portion of the nitrogen in West Coast forests has been traced back to salmon, which can travel hundreds of miles upstream to lay their eggs. The largest trees on Earth simply wouldn’t exist without salmon.

But their situation is precarious. This year, officials in California are bringing salmon downstream in convoys of trucks, because river levels are too low and the temperatures too warm for them to have a reasonable chance of surviving. One species, the winter-run Chinook salmon, is at particularly high risk of decline in the next few years should the warm water persist offshore.

“You talk to fishermen, and they all say: ‘We’ve never seen anything like this before,’ ” says Peterson. “So when you have no experience with something like this, it gets like, ‘What the hell’s going on?’ ”

Atmospheric scientists increasingly believe that the exceptionally warm waters over the past months are the early indications of a phase shift in the Pacific Decadal Oscillation, a cyclical warming of the North Pacific that happens a few times each century. Positive phases of the PDO have been known to last for 15 to 20 years, during which global warming can proceed at double the rate seen during negative phases. It also makes big El Niños, like this year’s, more likely. The nature of PDO phase shifts is unpredictable — climate scientists simply haven’t yet figured out precisely what’s behind them and why they happen when they do. It’s not a permanent change — the ocean’s temperature will likely drop from these record highs, at least temporarily, some time over the next few years — but the impact on marine species will be lasting, and scientists have pointed to the PDO as a global-warming preview.

“The climate [change] models predict this gentle, slow increase in temperature,” says Peterson, “but the main problem we’ve had for the last few years is the variability is so high. As scientists, we can’t keep up with it, and neither can the animals.” Peterson likens it to a boxer getting pummeled round after round: “At some point, you knock them down, and the fight is over.”

India

Pavement-melting heat waves in India. Harish Tyagi/EPA/Corbis

Attendant with this weird wildlife behavior is a stunning drop in the number of plankton — the basis of the ocean’s food chain. In July, another major study concluded that acidifying oceans are likely to have a “quite traumatic” impact on plankton diversity, with some species dying out while others flourish. As the oceans absorb carbon dioxide from the atmosphere, it’s converted into carbonic acid — and the pH of seawater declines. According to lead author Stephanie Dutkiewicz of MIT, that trend means “the whole food chain is going to be different.”

The Hansen study may have gotten more attention, but the Dutkiewicz study, and others like it, could have even more dire implications for our future. The rapid changes Dutkiewicz and her colleagues are observing have shocked some of their fellow scientists into thinking that yes, actually, we’re heading toward the worst-case scenario. Unlike a prediction of massive sea-level rise just decades away, the warming and acidifying oceans represent a problem that seems to have kick-started a mass extinction on the same time scale.

Jacquelyn Gill is a paleoecologist at the University of Maine. She knows a lot about extinction, and her work is more relevant than ever. Essentially, she’s trying to save the species that are alive right now by learning more about what killed off the ones that aren’t. The ancient data she studies shows “really compelling evidence that there can be events of abrupt climate change that can happen well within human life spans. We’re talking less than a decade.”

For the past year or two, a persistent change in winds over the North Pacific has given rise to what meteorologists and oceanographers are calling “the blob” — a highly anomalous patch of warm water between Hawaii, Alaska and Baja California that’s thrown the marine ecosystem into a tailspin. Amid warmer temperatures, plankton numbers have plummeted, and the myriad species that depend on them have migrated or seen their own numbers dwindle.

Significant northward surges of warm water have happened before, even frequently. El Niño, for example, does this on a predictable basis. But what’s happening this year appears to be something new. Some climate scientists think that the wind shift is linked to the rapid decline in Arctic sea ice over the past few years, which separate research has shown makes weather patterns more likely to get stuck.

A similar shift in the behavior of the jet stream has also contributed to the California drought and severe polar vortex winters in the Northeast over the past two years. An amplified jet-stream pattern has produced an unusual stagnation off the West Coast that’s persisted for most of the past 18 months. Daniel Swain, a Stanford University meteorologist, has called it the “Ridiculously Resilient Ridge” — weather patterns just aren’t supposed to last this long.

What’s increasingly uncontroversial among scientists is that in many ecosystems, the impacts of the current off-the-charts temperatures in the North Pacific will linger for years, or longer. The largest ocean on Earth, the Pacific is exhibiting cyclical variability to greater extremes than other ocean basins. While the North Pacific is currently the most dramatic area of change in the world’s oceans, it’s not alone: Globally, 2014 was a record-setting year for ocean temperatures, and 2015 is on pace to beat it soundly, boosted by the El Niño in the Pacific. Six percent of the world’s reefs could disappear before the end of the decade, perhaps permanently, thanks to warming waters.

Since warmer oceans expand in volume, it’s also leading to a surge in sea-level rise. One recent study showed a slowdown in Atlantic Ocean currents, perhaps linked to glacial melt from Greenland, that caused a four-inch rise in sea levels along the Northeast coast in just two years, from 2009 to 2010. To be sure, it seems like this sudden and unpredicted surge was only temporary, but scientists who studied the surge estimated it to be a 1-in-850-year event, and it’s been blamed on accelerated beach erosion “almost as significant as some hurricane events.”

Turkey

Biblical floods in Turkey. Ali Atmaca/Anadolu Agency/Getty

Possibly worse than rising ocean temperatures is the acidification of the waters. Acidification has a direct effect on mollusks and other marine animals with hard outer bodies: A striking study last year showed that, along the West Coast, the shells of tiny snails are already dissolving, with as-yet-unknown consequences on the ecosystem. One of the study’s authors, Nina Bednaršek, told Science magazine that the snails’ shells, pitted by the acidifying ocean, resembled “cauliflower” or “sandpaper.” A similarly striking study by more than a dozen of the world’s top ocean scientists this July said that the current pace of increasing carbon emissions would force an “effectively irreversible” change on ocean ecosystems during this century. In as little as a decade, the study suggested, chemical changes will rise significantly above background levels in nearly half of the world’s oceans.

“I used to think it was kind of hard to make things in the ocean go extinct,” James Barry of the Monterey Bay Aquarium Research Institute in California told the Seattle Times in 2013. “But this change we’re seeing is happening so fast it’s almost instantaneous.”

Thanks to the pressure we’re putting on the planet’s ecosystem — warming, acidification and good old-fashioned pollution — the oceans are set up for several decades of rapid change. Here’s what could happen next.

The combination of excessive nutrients from agricultural runoff, abnormal wind patterns and the warming oceans is already creating seasonal dead zones in coastal regions when algae blooms suck up most of the available oxygen. The appearance of low-oxygen regions has doubled in frequency every 10 years since 1960 and should continue to grow over the coming decades at an even greater rate.

So far, dead zones have remained mostly close to the coasts, but in the 21st century, deep-ocean dead zones could become common. These low-oxygen regions could gradually expand in size — potentially thousands of miles across — which would force fish, whales, pretty much everything upward. If this were to occur, large sections of the temperate deep oceans would suffer should the oxygen-free layer grow so pronounced that it stratifies, pushing surface ocean warming into overdrive and hindering upwelling of cooler, nutrient-rich deeper water.

Enhanced evaporation from the warmer oceans will create heavier downpours, perhaps destabilizing the root systems of forests, and accelerated runoff will pour more excess nutrients into coastal areas, further enhancing dead zones. In the past year, downpours have broken records in Long Island, Phoenix, Detroit, Baltimore, Houston and Pensacola, Florida.

Evidence for the above scenario comes in large part from our best understanding of what happened 250 million years ago, during the “Great Dying,” when more than 90 percent of all oceanic species perished after a pulse of carbon dioxide and methane from land-based sources began a period of profound climate change. The conditions that triggered the “Great Dying” took hundreds of thousands of years to develop. But humans have been emitting carbon dioxide at a much quicker rate, so the current mass extinction took only 100 years or so to kick-start.

With all these stressors working against it, a hypoxic feedback loop could wind up destroying some of the oceans’ most species-rich ecosystems within our lifetime. A recent study by Sarah Moffitt of the University of California-Davis said it could take the ocean thousands of years to recover. “Looking forward for my kid, people in the future are not going to have the same ocean that I have today,” Moffitt said.

As you might expect, having tickets to the front row of a global environmental catastrophe is taking an increasingly emotional toll on scientists, and in some cases pushing them toward advocacy. Of the two dozen or so scientists I interviewed for this piece, virtually all drifted into apocalyptic language at some point.

For Simone Alin, an oceanographer focusing on ocean acidification at NOAA’s Pacific Marine Environmental Laboratory in Seattle, the changes she’s seeing hit close to home. The Puget Sound is a natural laboratory for the coming decades of rapid change because its waters are naturally more acidified than most of the world’s marine ecosystems.

The local oyster industry here is already seeing serious impacts from acidifying waters and is going to great lengths to avoid a total collapse. Alin calls oysters, which are non-native, the canary in the coal mine for the Puget Sound: “A canary is also not native to a coal mine, but that doesn’t mean it’s not a good indicator of change.”

Though she works on fundamental oceanic changes every day, the Dutkiewicz study on the impending large-scale changes to plankton caught her off-guard: “This was alarming to me because if the basis of the food web changes, then . . . everything could change, right?”

Alin’s frank discussion of the looming oceanic apocalypse is perhaps a product of studying unfathomable change every day. But four years ago, the birth of her twins “heightened the whole issue,” she says. “I was worried enough about these problems before having kids that I maybe wondered whether it was a good idea. Now, it just makes me feel crushed.”

Katharine Hayhoe

Katharine Hayhoe speaks about climate change to students and faculty at Wayland Baptist University in 2011. Geoffrey McAllister/Chicago Tribune/MCT/Getty

Katharine Hayhoe, a climate scientist and evangelical Christian, moved from Canada to Texas with her husband, a pastor, precisely because of its vulnerability to climate change. There, she engages with the evangelical community on science — almost as a missionary would. But she’s already planning her exit strategy: “If we continue on our current pathway, Canada will be home for us long term. But the majority of people don’t have an exit strategy. . . . So that’s who I’m here trying to help.”

James Hansen, the dean of climate scientists, retired from NASA in 2013 to become a climate activist. But for all the gloom of the report he just put his name to, Hansen is actually somewhat hopeful. That’s because he knows that climate change has a straightforward solution: End fossil-fuel use as quickly as possible. If, tomorrow, the leaders of the United States and China agreed to a sufficiently strong, coordinated carbon tax that was also applied to imports, the rest of the world would have no choice but to sign on. This idea has already been pitched to Congress several times, with tepid bipartisan support. Even though a carbon tax is probably a long shot, for Hansen, even the slim possibility that bold action like this might happen is enough for him to devote the rest of his life to working to achieve it. On a conference call with reporters in July, Hansen said a potential joint U.S.-China carbon tax is more important than whatever happens at the United Nations climate talks in Paris.

One group Hansen is helping is Our Children’s Trust, a legal advocacy organization that’s filed a number of novel challenges on behalf of minors under the idea that climate change is a violation of intergenerational equity — children, the group argues, are lawfully entitled to inherit a healthy planet.

A separate challenge to U.S. law is being brought by a former EPA scientist arguing that carbon dioxide isn’t just a pollutant (which, under the Clean Air Act, can dissipate on its own), it’s also a toxic substance. In general, these substances have exceptionally long life spans in the environment, cause an unreasonable risk, and therefore require remediation. In this case, remediation may involve planting vast numbers of trees or restoring wetlands to bury excess carbon underground.

Even if these novel challenges succeed, it will take years before a bend in the curve is noticeable. But maybe that’s enough. When all feels lost, saving a few species will feel like a triumph.

From The Archives Issue 1241: August 13, 2015

Read more: http://www.rollingstone.com/politics/news/the-point-of-no-return-climate-change-nightmares-are-already-here-20150805

Climate Seer James Hansen Issues His Direst Forecast Yet (The Daily Beast) + other sources, and repercussions

A polar bear walks in the snow near the Hudson Bay, waiting for the bay to freeze, November 13th, 2007, outside Churchill, Manitoba, Canada. Polar bears return to Churchill, the polar bear capital of the world, to hunt for seals on the icepack every year at this time and remain on the icepack feeding on seals until the spring thaw.

Paul J. Richards/AFP/Getty

Mark Hertsgaard 

07.20.15 1:00 AM ET

James Hansen’s new study explodes conventional goals of climate diplomacy and warns of 10 feet of sea level rise before 2100. The good news is, we can fix it.

James Hansen, the former NASA scientist whose congressional testimony put global warming on the world’s agenda a quarter-century ago, is now warning that humanity could confront “sea level rise of several meters” before the end of the century unless greenhouse gas emissions are slashed much faster than currently contemplated. This roughly 10 feet of sea level rise—well beyond previous estimates—would render coastal cities such as New York, London, and Shanghai uninhabitable. “Parts of [our coastal cities] would still be sticking above the water,” Hansen says, “but you couldn’t live there.”

James Hansen

Columbia University

This apocalyptic scenario illustrates why the goal of limiting temperature rise to 2 degrees Celsius is not the safe “guardrail” most politicians and media coverage imply it is, argue Hansen and 16 colleagues in a blockbuster study they are publishing this week in the peer-reviewed journal Atmospheric Chemistry and Physics. On the contrary, a 2 C future would be “highly dangerous.”

If Hansen is right—and he has been right about the big issues in climate science sooner, and for longer, than anyone—the implications are vast and profound.

Physically, Hansen’s findings mean that Earth’s ice is melting and its seas are rising much faster than expected. Other scientists have offered less extreme findings; the United Nations Intergovernmental Panel on Climate Change (IPCC) has projected closer to 3 feet of sea level rise by the end of the century, an amount experts say will be difficult enough to cope with. (Three feet of sea level rise would put runways of all three New York City-area airports underwater unless protective barriers were erected. The same holds for airports in the San Francisco Bay Area.)

Worldwide, approximately $3 trillion worth of infrastructure vital to civilization — water treatment plants, power stations, highways — lies within 3 feet of current sea level, according to the Stern Review, a comprehensive analysis published by the British government.

Hansen’s track record commands respect. From the time the soft-spoken Iowan told the U.S. Senate in 1988 that man-made global warming was no longer a theory but had in fact begun and threatened unparalleled disaster, he has consistently been ahead of the scientific curve.

Hansen has long suspected that computer models underestimated how sensitive Earth’s ice sheets were to rising temperatures. Indeed, the IPCC excluded ice sheet melt altogether from its calculations of sea level rise. For their study, Hansen and his colleagues combined ancient paleo-climate data with new satellite readings and an improved model of the climate system to demonstrate that ice sheets can melt at a “non-linear” rate: rather than an incremental melting as Earth’s poles inexorably warm, ice sheets might melt at exponential rates, shedding dangerous amounts of mass in a matter of decades, not millennia. In fact, current observations indicate that some ice sheets already are melting this rapidly.

“Prior to this paper I suspected that to be the case,” Hansen told The Daily Beast. “Now we have evidence to make that statement based on much more than suspicion.”


Politically, Hansen’s new projections amount to a huge headache for diplomats, activists, and anyone else hoping that a much-anticipated global climate summit the United Nations is convening in Paris in December will put the world on a safe path. President Barack Obama and other world leaders must now reckon with the possibility that the 2 degrees goal they affirmed at the Copenhagen summit in 2009 is actually a recipe for catastrophe. In effect, Hansen’s study explodes what has long been the goal of conventional climate diplomacy.

More troubling, honoring even the conventional 2 degrees C target has so far proven extremely challenging on political and economic grounds. Current emission trajectories put the world on track towards a staggering 4 degrees of warming before the end of the century, an amount almost certainly beyond civilization’s coping capacity. In preparation for the Paris summit, governments have begun announcing commitments to reduce emissions, but to date these commitments are falling well short of satisfying the 2 degrees goal. Now, factor in the possibility that even 2 degrees is too much and many negotiators may be tempted to throw up their hands in despair.

They shouldn’t. New climate science brings good news as well as bad. Humanity can limit temperature rise to 1.5 degrees C if it so chooses, according to a little-noticed study by experts at the Potsdam Institute for Climate Impact Research (now perhaps the world’s foremost climate research center) and the International Institute for Applied Systems Analysis, published in Nature Climate Change in May.

“Actions for returning global warming to below 1.5 degrees Celsius by 2100 are in many ways similar to those limiting warming to below 2 degrees Celsius,” said Joeri Rogelj, a lead author of the study. “However … emission reductions need to scale up swiftly in the next decades.” And there’s a significant catch: Even this relatively optimistic study concludes that it’s too late to prevent global temperature rising by 2 degrees C. But this overshoot of the 2 C target can be made temporary, the study argues; the total increase can be brought back down to 1.5 C later in the century.

Besides the faster emissions reductions Rogelj referenced, two additional tools are essential, the study outlines. Energy efficiency—shifting to less wasteful lighting, appliances, vehicles, building materials and the like—is already the cheapest, fastest way to reduce emissions. Improved efficiency has made great progress in recent years but will have to accelerate, especially in emerging economies such as China and India.

Also necessary will be breakthroughs in so-called “carbon negative” technologies. Call it the photosynthesis option: because plants inhale carbon dioxide and store it in their roots, stems, and leaves, one can remove carbon from the atmosphere by growing trees, planting cover crops, burying charred plant materials underground, and other kindred methods. In effect, carbon negative technologies can turn back the clock on global warming, making the aforementioned descent from the 2 C overshoot to the 1.5 C goal later in this century theoretically possible. Carbon-negative technologies thus far remain unproven at the scale needed, however; more research and deployment is required, according to the study.

Together, the Nature Climate Change study and Hansen’s new paper give credence to the many developing nations and climate justice advocates who have called for more ambitious action. The authors of the Nature Climate Change study point out that the 1.5 degrees goal “is supported by more than 100 countries worldwide, including those most vulnerable to climate change.” In May, the governments of 20 of those countries, including the Philippines, Costa Rica, Kenya, and Bangladesh, declared the 2 degrees target “inadequate” and called for governments to “reconsider” it in Paris.

Hansen too is confident that the world “could actually come in well under 2 degrees, if we make the price of fossil fuels honest.”

That means making the market price of gasoline and other products derived from fossil fuels reflect the enormous costs that burning those fuels currently externalizes onto society as a whole. Economists from left to right have advocated achieving this by putting a rising fee or tax on fossil fuels. This would give businesses, governments, and other consumers an incentive to shift to non-carbon fuels such as solar, wind, nuclear, and, best of all, increased energy efficiency. (The cheapest and cleanest fuel is the fuel you don’t burn in the first place.)

But putting a fee on fossil fuels will raise their price to consumers, threatening individual budgets and broader economic prospects, as opponents will surely point out. Nevertheless, higher prices for carbon-based fuels need not have injurious economic effects if the fees driving those higher prices are returned to the public to spend as it wishes. It’s been done that way for years with great success in Alaska, where all residents receive an annual check in compensation for the impact the Alaskan oil pipeline has on the state.

“Tax Pollution, Pay People” is the bumper sticker summary coined by activists at the Citizens Climate Lobby. Legislation to this effect has been introduced in both houses of the U.S. Congress.

Meanwhile, there are also a host of other reasons to believe it’s not too late to preserve a livable climate for young people and future generations.

The transition away from fossil fuels has begun and is gaining speed and legitimacy. In 2014, global greenhouse gas emissions remained flat even as the world economy grew—a first. There has been a spectacular boom in wind and solar energy, including in developing countries, as their prices plummet. These technologies now qualify as a “disruptive” economic force that promises further breakthroughs, said Achim Steiner, executive director of the UN Environment Programme.

Coal, the most carbon-intensive conventional fossil fuel, is in a death spiral, partly thanks to another piece of encouraging news: the historic climate agreement the U.S. and China reached last November, which envisions both nations slashing coal consumption (as China is already doing). Hammering another nail into coal’s coffin, the leaders of Great Britain’s three main political parties pledged to phase out coal, no matter who won the general elections last May.

“If you look at the long-term [for coal], it’s not getting any better,” said Standard & Poor’s Aneesh Prabhu when S&P downgraded coal company bonds to junk status. “It’s a secular decline,” not a mere cyclical downturn.

Last but not least, a vibrant mass movement has arisen to fight climate change, most visibly manifested when hundreds of thousands of people thronged the streets of New York City last September, demanding action from global leaders gathered at the UN. The rally was impressive enough that it led oil and gas giant ExxonMobil to increase its internal estimate of how likely the U.S. government is to take strong action. “That many people marching is clearly going to put pressure on government to do something,” an ExxonMobil spokesman told Bloomberg Businessweek.

The climate challenge has long amounted to a race between the imperatives of science and the contingencies of politics. With Hansen’s paper, the science has gotten harsher, even as the Nature Climate Change study affirms that humanity can still choose life, if it will. The question now is how the politics will respond—now, at Paris in December, and beyond.

Mark Hertsgaard has reported on politics, culture, and the environment from more than 20 countries and written six books, including “HOT: Living Through the Next Fifty Years on Earth.”

*   *   *

Experts make dire prediction about sea levels (CBS)


In the future, there could be major flooding along every coast. So says a new study that warns the world’s seas are rising.

Ever-warming oceans that are melting polar ice could raise sea levels 15 feet in the next 50 to 100 years, NASA’s former climate chief now says. That’s five times higher than previous predictions.

“This is the biggest threat the planet faces,” said James Hansen, the co-author of the new journal article raising that alarm scenario.

“If we get sea level rise of several meters, all coastal cities become dysfunctional,” he said. “The implications of this are just incalculable.”

If ocean levels rise just 10 feet, areas like Miami, Boston, Seattle and New York City would face flooding.

The melting ice would cool ocean surfaces at the poles even more, even as the overall climate continues to warm. The temperature difference would fuel even more volatile weather.

“As the atmosphere gets warmer and there’s more water vapor, that’s going to drive stronger thunderstorms, stronger hurricanes, stronger tornadoes, because they all get their energy from the water vapor,” said Hansen.

Nearly a decade ago, Hansen told “60 Minutes” we had 10 years to get global warming under control, or we would reach a “tipping point.”

“It will be a situation that is out of our control,” he said. “We’re essentially at the edge of that. That’s why this year is a critical year.”

Critical because of a United Nations meeting in Paris that is designed to reach legally binding agreements on carbon emissions, the greenhouse gases that create global warming.

*   *   *

Sea Levels Could Rise Much Faster than Thought (Climate Denial Crock of the Week)

with Peter Sinclair | July 21, 2015

Washington Post:

James Hansen has often been out ahead of his scientific colleagues.

With his 1988 congressional testimony, the then-NASA scientist is credited with putting the global warming issue on the map by saying that a warming trend had already begun. “It is time to stop waffling so much and say that the evidence is pretty strong that the greenhouse effect is here,” Hansen famously testified.

Now Hansen — who retired in 2013 from his NASA post, and is currently an adjunct professor at Columbia University’s Earth Institute — is publishing what he says may be his most important paper. Along with 16 other researchers — including leading experts on the Greenland and Antarctic ice sheets — he has authored a lengthy study outlining a scenario of potentially rapid sea level rise combined with more intense storm systems.

It’s an alarming picture of where the planet could be headed — and hard to ignore, given its author. But it may also meet with considerable skepticism in the broader scientific community, given that its scenarios of sea level rise occur more rapidly than those ratified by the United Nations’ Intergovernmental Panel on Climate Change in its latest assessment of the state of climate science, published in 2013.

In the new study, Hansen and his colleagues suggest that the “doubling time” for ice loss from West Antarctica — the time period over which the amount of loss could double — could be as short as 10 years. In other words, a non-linear process could be at work, triggering major sea level rise in a time frame of 50 to 200 years. By contrast, Hansen and colleagues note, the IPCC assumed more of a linear process, suggesting only around 1 meter of sea level rise, at most, by 2100.
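The difference between the study’s non-linear reading and the IPCC’s more linear one comes down to simple arithmetic. The sketch below is a hedged illustration, not the paper’s actual model: the 1 mm/yr starting rate and the continuous-growth approximation are assumptions chosen for round numbers, but they show how a 10-year doubling time turns centimeters of linear rise into meters within a century.

```python
import math

# Illustrative back-of-the-envelope sketch (not from the paper itself): what a
# 10-year "doubling time" for ice loss implies for cumulative sea level rise,
# compared with linear extrapolation. The 1 mm/yr starting rate is an assumed
# round number for illustration only.

def cumulative_rise_mm(initial_rate_mm_per_yr, years, doubling_time=None):
    """Total sea level rise after `years`, in mm.

    Without a doubling_time the rate is constant (linear rise). With a
    doubling time T, the rate grows as 2**(t/T), so the cumulative rise is
    the integral of rate * 2**(t/T) over 0..years.
    """
    if doubling_time is None:
        return initial_rate_mm_per_yr * years
    k = math.log(2) / doubling_time           # continuous growth constant
    return initial_rate_mm_per_yr * (math.exp(k * years) - 1) / k

print(f"linear, 50 yr:          {cumulative_rise_mm(1.0, 50) / 1000:.2f} m")
print(f"10-yr doubling, 50 yr:  {cumulative_rise_mm(1.0, 50, 10) / 1000:.2f} m")
print(f"10-yr doubling, 100 yr: {cumulative_rise_mm(1.0, 100, 10) / 1000:.1f} m")
```

Under these assumed numbers, 50 years of linear rise yields only about 5 cm, while a 10-year doubling time yields roughly 0.45 m at 50 years and well over 10 m at 100 — the kind of runaway trajectory the study warns about.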

Here, a clip from our extended interview with Eric Rignot in December of 2014.  Rignot is one of the co-authors of the new study.

Slate:

The study—written by James Hansen, NASA’s former lead climate scientist, and 16 co-authors, many of whom are considered among the top in their fields—concludes that glaciers in Greenland and Antarctica will melt 10 times faster than previous consensus estimates, resulting in sea level rise of at least 10 feet in as little as 50 years. The study, which has not yet been peer reviewed, brings new importance to a feedback loop in the ocean near Antarctica that results in cooler freshwater from melting glaciers forcing warmer, saltier water underneath the ice sheets, speeding up the melting rate. Hansen, who is known for being alarmist and also right, acknowledges that his study implies change far beyond previous consensus estimates. In a conference call with reporters, he said he hoped the new findings would be “substantially more persuasive than anything previously published.” I certainly find them to be.

We conclude that continued high emissions will make multi-meter sea level rise practically unavoidable and likely to occur this century. Social disruption and economic consequences of such large sea level rise could be devastating. It is not difficult to imagine that conflicts arising from forced migrations and economic collapse might make the planet ungovernable, threatening the fabric of civilization.

The science of ice melt rates is advancing so fast, scientists have generally been reluctant to put a number to what is essentially an unpredictable, non-linear response of ice sheets to a steadily warming ocean. With Hansen’s new study, that changes in a dramatic way. One of the study’s co-authors is Eric Rignot, whose own study last year found that glacial melt from West Antarctica now appears to be “unstoppable.” Chris Mooney, writing for Mother Jones, called that study a “holy shit” moment for the climate.

Daily Beast:

New climate science brings good news as well as bad. Humanity can limit temperature rise to 1.5 degrees C if it so chooses, according to a little-noticed study by experts at the Potsdam Institute for Climate Impact Research (now perhaps the world’s foremost climate research center) and the International Institute for Applied Systems Analysis, published in Nature Climate Change in May.


“Actions for returning global warming to below 1.5 degrees Celsius by 2100 are in many ways similar to those limiting warming to below 2 degrees Celsius,” said Joeri Rogelj, a lead author of the study. “However … emission reductions need to scale up swiftly in the next decades.” And there’s a significant catch: Even this relatively optimistic study concludes that it’s too late to prevent global temperature rising by 2 degrees C. But this overshoot of the 2 C target can be made temporary, the study argues; the total increase can be brought back down to 1.5 C later in the century.

Besides the faster emissions reductions Rogelj referenced, two additional tools are essential, the study outlines. Energy efficiency—shifting to less wasteful lighting, appliances, vehicles, building materials and the like—is already the cheapest, fastest way to reduce emissions. Improved efficiency has made great progress in recent years but will have to accelerate, especially in emerging economies such as China and India.

Also necessary will be breakthroughs in so-called “carbon negative” technologies. Call it the photosynthesis option: because plants inhale carbon dioxide and store it in their roots, stems, and leaves, one can remove carbon from the atmosphere by growing trees, planting cover crops, burying charred plant materials underground, and other kindred methods. In effect, carbon negative technologies can turn back the clock on global warming, making the aforementioned descent from the 2 C overshoot to the 1.5 C goal later in this century theoretically possible. Carbon-negative technologies thus far remain unproven at the scale needed, however; more research and deployment is required, according to the study.

*   *   *

Earth’s Most Famous Climate Scientist Issues Bombshell Sea Level Warning (Slate)


Monday’s new study greatly increases the potential for catastrophic near-term sea level rise. Here, Miami Beach, among the most vulnerable cities to sea level rise in the world. Photo by Joe Raedle/Getty Images

In what may prove to be a turning point for political action on climate change, a breathtaking new study casts extreme doubt on the near-term stability of global sea levels.

The study—written by James Hansen, NASA’s former lead climate scientist, and 16 co-authors, many of whom are considered among the top in their fields—concludes that glaciers in Greenland and Antarctica will melt 10 times faster than previous consensus estimates, resulting in sea level rise of at least 10 feet in as little as 50 years. The study, which has not yet been peer-reviewed, brings new importance to a feedback loop in the ocean near Antarctica that results in cooler freshwater from melting glaciers forcing warmer, saltier water underneath the ice sheets, speeding up the melting rate. Hansen, who is known for being alarmist and also right, acknowledges that his study implies change far beyond previous consensus estimates. In a conference call with reporters, he said he hoped the new findings would be “substantially more persuasive than anything previously published.” I certainly find them to be.

To come to their findings, the authors used a mixture of paleoclimate records, computer models, and observations of current rates of sea level rise, but “the real world is moving somewhat faster than the model,” Hansen says.

Hansen’s study does not attempt to predict the precise timing of the feedback loop, only that it is “likely” to occur this century. The implications are mindboggling: In the study’s likely scenario, New York City—and every other coastal city on the planet—may only have a few more decades of habitability left. That dire prediction, in Hansen’s view, requires “emergency cooperation among nations.”

We conclude that continued high emissions will make multi-meter sea level rise practically unavoidable and likely to occur this century. Social disruption and economic consequences of such large sea level rise could be devastating. It is not difficult to imagine that conflicts arising from forced migrations and economic collapse might make the planet ungovernable, threatening the fabric of civilization.

The science of ice melt rates is advancing so fast, scientists have generally been reluctant to put a number to what is essentially an unpredictable, nonlinear response of ice sheets to a steadily warming ocean. With Hansen’s new study, that changes in a dramatic way. One of the study’s co-authors is Eric Rignot, whose own study last year found that glacial melt from West Antarctica now appears to be “unstoppable.” Chris Mooney, writing for Mother Jones, called that study a “holy shit” moment for the climate.

One necessary note of caution: Hansen’s study comes via a nontraditional publishing decision by its authors. The study will be published in Atmospheric Chemistry and Physics, an open-access “discussion” journal, and will not have formal peer review prior to its appearance online later this week. [Update, July 23: The paper is now available.] The complete discussion draft circulated to journalists was 66 pages long, and included more than 300 references. The peer review will take place in real time, with responses to the work by other scientists also published online. Hansen said this publishing timeline was necessary to make the work public as soon as possible before global negotiators meet in Paris later this year. Still, the lack of traditional peer review and the fact that this study’s results go far beyond what’s been previously published will likely bring increased scrutiny. On Twitter, Ruth Mottram, a climate scientist whose work focuses on Greenland and the Arctic, was skeptical of such enormous rates of near-term sea level rise, though she defended Hansen’s decision to publish in a nontraditional way.

In 2013, Hansen left his post at NASA to become a climate activist because, in his words, “as a government employee, you can’t testify against the government.” In a wide-ranging December 2013 study, conducted to support Our Children’s Trust, a group advancing legal challenges to lax greenhouse gas emissions policies on behalf of minors, Hansen called for a “human tipping point”—essentially, a social revolution—as one of the most effective ways of combating climate change, though he still favors a bilateral carbon tax agreed upon by the United States and China as the best near-term climate policy. In the new study, Hansen writes, “there is no morally defensible excuse to delay phase-out of fossil fuel emissions as rapidly as possible.”

Asked whether Hansen has plans to personally present the new research to world leaders, he said: “Yes, but I can’t talk about that today.” What’s still uncertain is whether, like with so many previous dire warnings, world leaders will be willing to listen.

*   *   *

Ice Melt, Sea Level Rise and Superstorms (Climate Sciences, Awareness and Solutions / Earth Institute, Columbia University)

23 July 2015

James Hansen

The paper “Ice melt, sea level rise and superstorms: evidence from paleoclimate data, climate modeling, and modern observations that 2°C global warming is highly dangerous” has been published in Atmospheric Chemistry and Physics Discussions and is freely available here.

The paper draws on a large body of work by the research community, as indicated by the 300 references. No doubt we missed some important relevant contributions, which we may be able to rectify in the final version of the paper. I thank all the researchers who provided data or information, many of whom I may have failed to include in the acknowledgments, as the work for the paper took place over a period of several years.

I am especially grateful to the Durst family for a generous grant that allowed me to work full time this year on finishing the paper, as well as the other supporters of our program Climate Science, Awareness and Solutions at the Columbia University Earth Institute.

In the conceivable event that you do not read the full paper plus supplement, I include the Acknowledgments here:

Acknowledgments. Completion of this study was made possible by a generous gift from The Durst Family to the Climate Science, Awareness and Solutions program at the Columbia University Earth Institute. That program was initiated in 2013 primarily via support from the Grantham Foundation for Protection of the Environment, Jim and Krisann Miller, and Gerry Lenfest and sustained via their continuing support. Other substantial support has been provided by the Flora Family Foundation, Dennis Pence, the Skoll Global Threats Fund, Alexander Totic and Hugh Perrine. We thank Anders Carlson, Elsa Cortijo, Nil Irvali, Kurt Lambeck, Scott Lehman, and Ulysses Ninnemann for their kind provision of data and related information. Support for climate simulations was provided by the NASA High-End Computing (HEC) Program through the NASA Center for Climate Simulation (NCCS) at Goddard Space Flight Center.

Climate models are even more accurate than you thought (The Guardian)

The difference between modeled and observed global surface temperature changes is 38% smaller than previously thought

Looking across the frozen sea of Ullsfjord in Norway. Melting Arctic sea ice is one complicating factor in comparing modeled and observed surface temperatures. Photograph: Neale Clark/Robert Harding World Imagery/Corbis

Global climate models aren’t given nearly enough credit for their accurate global temperature change projections. As the 2014 IPCC report showed, observed global surface temperature changes have been within the range of climate model simulations.

Now a new study shows that the models were even more accurate than previously thought. In previous evaluations like the one done by the IPCC, climate model simulations of global surface air temperature were compared to global surface temperature observational records like HadCRUT4. However, over the oceans, HadCRUT4 uses sea surface temperatures rather than air temperatures.

A depiction of how global temperatures calculated from models use air temperatures above the ocean surface (right frame), while observations are based on the water temperature in the top few metres (left frame). Created by Kevin Cowtan.

Thus comparing modeled air temperatures with HadCRUT4 observations isn’t quite an apples-to-apples comparison for the oceans. As it turns out, sea surface temperatures haven’t been warming as fast as marine air temperatures, so this comparison introduces a bias that makes the observations look cooler than the model simulations. As lead author Kevin Cowtan told me,

We have highlighted the fact that the planet does not warm uniformly. Air temperatures warm faster than the oceans, air temperatures over land warm faster than global air temperatures. When you put a number on global warming, that number always depends on what you are measuring. And when you do a comparison, you need to ensure you are comparing the same things.

The model projections have generally reported global air temperatures. That’s quite helpful, because we generally live in the air rather than the water. The observations, by mixing air and water temperatures, are expected to slightly underestimate the warming of the atmosphere.

The new study addresses this problem by instead blending the modeled air temperatures over land with the modeled sea surface temperatures to allow for an apples-to-apples comparison. The authors also identified another challenging issue for these model-data comparisons in the Arctic. Over sea ice, surface air temperature measurements are used, but for open ocean, sea surface temperatures are used. As co-author Michael Mann notes, as Arctic sea ice continues to melt away, this is another factor that accurate model-data comparisons must account for.

One key complication that arises is that the observations typically extrapolate land temperatures over sea ice covered regions since the sea surface temperature is not accessible in that case. But the distribution of sea ice changes seasonally, and there is a long-term trend toward decreasing sea ice in many regions. So the observations actually represent a moving target.

A depiction of how as sea ice retreats, some grid cells change from taking air temperatures to taking water temperatures. If the two are not on the same scale, this introduces a bias. Created by Kevin Cowtan.

When accounting for these factors, the study finds that the difference between observed and modeled temperatures since 1975 is smaller than previously believed. The models had projected a 0.226°C per decade global surface air warming trend for 1975–2014 (and 0.212°C per decade over the geographic area covered by the HadCRUT4 record). However, when matching the HadCRUT4 methods for measuring sea surface temperatures, the modeled trend is reduced to 0.196°C per decade. The observed HadCRUT4 trend is 0.170°C per decade.
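The trend figures above come from ordinary least-squares fits to yearly temperature series. As a minimal sketch of that calculation — the series below is synthetic, invented purely to exercise the code, not HadCRUT4 data:

```python
import numpy as np

def trend_per_decade(years, anomalies_c):
    """Ordinary least-squares slope of anomaly vs. year, in deg C per decade."""
    slope_per_year = np.polyfit(years, anomalies_c, deg=1)[0]
    return 10.0 * slope_per_year

# Synthetic 1975-2014 anomaly series: a 0.02 C/yr underlying trend plus
# reproducible pseudo-random "weather noise" (invented numbers, not data).
years = np.arange(1975, 2015)
rng = np.random.default_rng(0)
anoms = 0.02 * (years - 1975) + rng.normal(0.0, 0.08, years.size)

print(f"{trend_per_decade(years, anoms):.3f} C/decade")  # recovers ~0.20
```

A fit like this is also why the choice of series matters so much: feed it blended air-and-water observations instead of pure air temperatures and the recovered slope shifts, which is exactly the bias the study quantifies.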

So when doing an apples-to-apples comparison, the difference between modeled global temperature simulations and observations is 38% smaller than previous estimates. Additionally, as noted in a 2014 paper led by NASA GISS director Gavin Schmidt, less energy from the sun has reached the Earth’s surface than anticipated in these model simulations, both because solar activity declined more than expected, and volcanic activity was higher than expected. Ed Hawkins, another co-author of this study, wrote about this effect.

Combined, the apparent discrepancy between observations and simulations of global temperature over the past 15 years can be partly explained by the way the comparison is done (about a third), by the incorrect radiative forcings (about a third) and the rest is either due to climate variability or because the models are slightly over sensitive on average. But, the room for the latter effect is now much smaller.

Comparison of 84 climate model simulations (using RCP8.5) against HadCRUT4 observations (black), using either air temperatures (red line and shading) or blended temperatures using the HadCRUT4 method (blue line and shading). The upper panel shows anomalies derived from the unmodified climate model results, the lower shows the results adjusted to include the effect of updated forcings from Schmidt et al. (2014).

As Hawkins notes, the remaining discrepancy between modeled and observed temperatures may come down to climate variability; namely the fact that there has been a preponderance of La Niña events over the past decade, which have a short-term cooling influence on global surface temperatures. When there are more La Niñas, we expect temperatures to fall below the average model projection, and when there are more El Niños, we expect temperatures to be above the projection, as may be the case when 2015 breaks the temperature record.

We can’t predict changes in solar activity, volcanic eruptions, or natural ocean cycles ahead of time. If we want to evaluate the accuracy of long-term global warming model projections, we have to account for the difference between the simulated and observed changes in these factors. When the authors of this study did so, they found that climate models have very accurately projected the observed global surface warming trend.

In other words, as I discussed in my book and Denial101x lecture, climate models have proven themselves reliable in predicting long-term global surface temperature changes. In fact, even more reliable than I realized.

Denial101x climate science success stories lecture by Dana Nuccitelli.

There’s a common myth that models are unreliable, often based on apples-to-oranges comparisons, like looking at satellite estimates of temperatures higher in the atmosphere versus modeled surface air temperatures. Some contrarians, like John Christy, will only consider the temperature high in the atmosphere, where satellite estimates are less reliable, and where people don’t live.

This new study has shown that when we do an apples-to-apples comparison, climate models have done a good job projecting the observed temperatures where humans live. And those models predict that unless we take serious and immediate action to reduce human carbon pollution, global warming will continue to accelerate into dangerous territory.

Sabesp weighs the loss of the Cantareira and races against the clock (Exame)

JC, 5201, June 22, 2015

São Paulo’s water crisis isn’t over yet

After last year’s drought left São Paulo on the brink of severe water rationing, the late-summer rains gave Sabesp – the main culprit in the crisis, according to city officials – a second chance to ramp up infrastructure investment.

With the dry season underway, there is a race against time to divert rivers and connect systems before the already depleted reservoirs run low again.

The race against time underscores the precarious situation of South America’s largest metropolis after two decades without a single major water project.

The reservoirs have yet to recover from last year’s drought, and meteorologists are forecasting hotter months ahead because of the El Niño climate phenomenon.

“Infrastructure has not been Sabesp’s priority in recent years. They did not take measures to avoid the crisis,” said Pedro Caetano Mancuso, director of the Water Safety Reference Center at the Universidade de São Paulo.

“Although Sabesp is willing to do its homework now, the question is whether it will be finished in time to avert an even bigger problem.”

Sabesp – a state-controlled company – said the severity of last year’s drought, not a lack of infrastructure investment, caused the crisis.

“We were prepared for a drought as bad as or worse than that of 1953,” when Sabesp faced a similar crisis, CEO Jerson Kelman told city council members at a hearing on May 13.

“What happened in 2014 was that we had half that year’s rainfall. For that, we were not prepared.”

‘Foreseeable’

In a June 10 report, the São Paulo City Council blamed Sabesp for the crisis that cut off supply in some neighborhoods, saying the drought was foreseeable.

“If Sabesp had invested the dividends it distributed on the New York Stock Exchange in works to modernize the systems that supply the capital and in network maintenance, we would not be facing rationing disguised as pressure reduction,” said Laércio Benko, the councilman who led the commission created to investigate São Paulo’s water supply shortage.

The largest of the infrastructure projects Sabesp needs this year to guarantee the supply of drinking water is behind schedule.

The project to connect the Rio Pequeno to the Billings reservoir, originally scheduled for completion in May, will not be finished until August because of delays in environmental and land-use permits, Sabesp’s press office said in an e-mailed response to questions. If completed this year, the package of five emergency works Sabesp is investing in would be enough to avoid rationing, according to the company.

Main reservoir

Without the projects, and if rainfall stays at or below last year’s level, Sabesp projects that its main reservoir – known as the Cantareira – could run dry by August, according to internal projections obtained by Bloomberg News.

In the company’s worst-case scenario, water supply could be cut off across most of the São Paulo metropolitan area five days a week, according to the document, which was prepared as part of a contingency plan for São Paulo.

Sabesp said in the e-mail that rainfall so far has been positive. To speed up emergency investments now, Sabesp is cutting spending and raising water prices. The company will halve spending on sewage collection and treatment this year, executives said on a conference call with investors in April. The rate increase reflects Sabesp’s “financial stress,” CFO Rui Affonso said on the call.

Falling shares

Sabesp shares fell 4.8 percent on Monday, the worst performance in São Paulo trading, after the Federation of Industries of the State of São Paulo (Fiesp) said it had filed for an injunction to block the rate increase.

“Last year’s drought will be fully felt in this year’s results,” Alexandre Montes, an equity analyst at Lopes Filho Associados Consultores de Investimentos, said in a telephone interview from Rio. “Even if the drought eases now, and even if everything goes well, Sabesp’s results will fall.”

(Revista Exame)

There never was a global warming ‘pause,’ NOAA study concludes (Environment & Energy Publishing)

Gayathri Vaidyanathan, E&E reporter

Published: Friday, June 5, 2015

The global warming “pause” does not exist, according to scientists at the National Oceanic and Atmospheric Administration.

Their finding refutes a theory that has dominated climate science in recent years. The Intergovernmental Panel on Climate Change (IPCC) in 2013 found that global temperatures in recent years have not risen as quickly as they did in the 20th century. That launched an academic hunt for the missing heat in the oceans, volcanoes and solar rays. Meanwhile, climate deniers triumphantly crowed that global warming has paused or gone on a “hiatus.”

But it now appears that the pause never was. NOAA scientists have fixed some small errors in global temperature data and found that temperatures over the past 15 years have been rising at a rate comparable to warming over the 20th century. The study was published yesterday in Science.

That a minor change to the analysis can switch the outcome from a hiatus to increased warming shows “how fragile a concept it [the hiatus] was in the first place,” said Gavin Schmidt, director of the NASA Goddard Institute for Space Studies, who was unaffiliated with the study.

According to the NOAA study, the world has warmed since 1998 by 0.11 degree Celsius per decade. Scientists had previously calculated that the trend was about half that.

The new rate is equal to the rate of warming seen between 1951 and 1999.

There has been no slowdown in the rate of global warming, said Thomas Karl, director of NOAA’s National Centers for Environmental Information and lead author of the study.

“Global warming is firmly entrenched on our planet, and it continues to progress and is likely to continue to do so in the future unless emissions of greenhouse gases are substantially altered,” he said.

Errors from weather stations, buoys and buckets

That NOAA has to adjust temperature readings is not unusual. Many factors can affect raw temperature measurements, according to a study by Karl in 1988.

For instance, a weather station may be situated beneath a tree, which would bias temperatures low. Measurements made near a parking lot would read warm due to the waves of heat emanating from asphalt surfaces. NOAA and other agencies adjust the raw temperature data to remove such biases.

It has become clear in recent years that some biases still persist in the data, particularly of ocean temperatures. The culprit: buckets.

Ships traverse the world, and, occasionally, workers onboard dip a bucket over the hull and bring up water that they measure using a thermometer. The method is old school and error prone — water in a bucket is usually cooler than the ocean.

For a long time, scientists had assumed that most ships no longer use buckets and instead measure water siphoned from the ocean to cool ship engines. The latter method is more robust. But data released last year showed otherwise and compelled NOAA to correct for this bias.

A second correction involved sensor-laden buoys interspersed across the oceans whose temperature readings are biased low. Karl and his colleagues corrected for this issue, as well.

The corrections “made a significant impact,” Karl said. “They added about 0.06 degrees C per decade additional warming since 2000.”

The ‘slowdown hasn’t gone away’

What that means for the global warming hiatus depends on whom you ask. The warming trend over the past 15 years is comparable to the trend between 1950 and 1998 (a 48-year stretch), which led Karl to say that global warming never slowed.

Other scientists were not fully convinced. For a truly apples-to-apples comparison, the past 15 years should be compared with other 15-year stretches, said Peter Stott, head of the climate monitoring and attribution team at the U.K. Met Office.

For instance, the globe warmed more slowly in the past 15 years than between 1983 and 1998 (the previous 15-year stretch), even with NOAA’s new data corrections, Stott said.

“The slowdown hasn’t gone away,” he said in an email. “While the Earth continues to accumulate energy as a result of increasing man-made greenhouse gas emissions … global temperatures have not increased smoothly.”

The disagreements arise because assigning trends — including the trend of a “hiatus” — to global warming depends on the time frame of reference.

“Trends based on short records are very sensitive to the beginning and end dates and do not in general reflect long-term climate trends,” the IPCC stated in 2013, even as it discussed the pause.
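The IPCC's point about short windows is easy to demonstrate: on the same noisy warming series, 15-year trends computed from different start dates scatter widely around the long-term rate. A minimal illustration with synthetic data (not NOAA's actual record):

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1950, 2015)
# Synthetic record: steady 0.015 °C/yr warming plus year-to-year noise
temps = 0.015 * (years - years[0]) + rng.normal(0.0, 0.1, years.size)

def trend(start, length=15):
    """Least-squares warming trend over a window, in °C per decade."""
    mask = (years >= start) & (years < start + length)
    return np.polyfit(years[mask], temps[mask], 1)[0] * 10

long_term = trend(1950, length=65)  # recovers roughly 0.15 °C/decade
# Every 15-year window tells a different story:
short = [trend(s) for s in range(1950, 2000)]
spread = max(short) - min(short)
```

The long record pins down the underlying trend, while the 15-year windows disagree with one another by a substantial fraction of that trend, which is exactly why start and end dates matter so much in the "hiatus" debate.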

Robert Kaufmann, an environment professor at Boston University who was unaffiliated with the study, called trends a “red herring.”

A trend implies that the planet will warm, decade after decade, at a steady clip. There is no reason why that should be the case, Kaufmann said. Many factors — human emissions of warming and cooling gases, natural variability, and external factors such as the sun — feed into Earth’s climate. The relative contributions of each factor can vary by year, decade, century or on even larger time scales.

“There is no scientific basis to assume that the climate is going to warm at the same rate year after year, decade after decade,” he said.

Copying the language of skeptics

Trends are a powerful weapon in the hands of climate deniers. As early as 2006, deniers used the slowdown of warming from 1998 onward to say that global warming had stopped or paused.

The idea of a “pause” seeped into academia, launching dozens of studies into what might have caused it. But there was a subtle difference between scientists’ understanding of the pause and that of the skeptics; scientists never believed that warming had stopped, only that it had slowed compared with the rapidly warming ’90s. They wanted to know why.

Over the years, scientists have unraveled the contributions of volcanoes to global cooling, the increased uptake of heat by the Pacific Ocean, the cooling role of La Niñas and other drivers of natural variability. Their understanding of our planet’s climate evolved rapidly.

As scientists wrote up their findings, they unwittingly adopted the skeptics’ language of the “pause,” said Stephan Lewandowsky, a psychologist at the University of Bristol who was unaffiliated with the NOAA study. That was problematic.

“That’s sort of a subtle semantic thing, but it is really important because it suggests that these [scientists] bought into the existence of the hiatus,” he said.

Then, in 2013, the IPCC wrote about the pause. The German government complained that the term implies that warming had stopped, which is inaccurate. The objection was ignored.

NOAA’s strong refutation of the hiatus is particularly weighty because it comes from a government lab, and the work was headed by Karl, a pioneer of temperature reanalysis studies.

NOAA will be using the data corrections to assess global temperatures from July onward, Karl said. NASA is discussing internally whether to apply the fixes suggested in the study, according to Schmidt of NASA.

The study was greeted by Democrats in Congress as proof that climate change is real. Sen. Barbara Boxer (D-Calif.), ranking member of the Environment and Public Works Committee, used it as an opportunity to chide her opponents.

“Climate change deniers in Congress need to stop ignoring the fact that the planet may be warming at an even faster rate than previously observed, and we must take action now to reduce dangerous carbon pollution,” she said in a statement.

Cemaden issues new projection for the Cantareira reservoir during the dry season (MCTI/Cemaden)

A survey by the National Center for Monitoring and Early Warning of Natural Disasters indicates rainfall and reserves below the historical average through December

In its latest report, published on Wednesday (27th), the National Center for Monitoring and Early Warning of Natural Disasters (Cemaden/MCTI) highlights the critical condition of the Cantareira System reservoir, indicating rainfall and reserves below the historical average through December of this year.

This situation will occur even accounting for the reduction in water withdrawals from the reservoir planned for September through November, announced in the joint statement of the National Water Agency (ANA) and the Department of Water and Electric Energy (DAEE) in the last week of May.

Based on the rain-gauge networks of Cemaden and the DAEE covering the catchment sub-basins of the Cantareira System, accumulated spatially averaged precipitation from October 2014 through March 2015 was 879 millimeters (mm), equivalent to 73.5% of the climatological average of 1,161 mm for the same period.

Accumulated spatially averaged precipitation in April 2015 was 52.4 mm, or 58.4% of the month's climatological average of 89.83 mm. Accumulated rainfall from May 1st through 29th, 2015 averaged 55.3 mm, or 70.7% of the historical average of 78.2 mm for the same period. The report also lists average precipitation values from data of the São Paulo State Basic Sanitation Company (Sabesp), which differ somewhat from Cemaden's figures.

In the current situation, the average inflow to the Cantareira System, that is, the balance between the stored volume of water and its replenishment by rainfall, is below the climatological average. The average inflow to the Cantareira System in May was 14.02 cubic meters per second (m³/s), which is 63.4% below the monthly average inflow of 38.27 m³/s. It is also below the historical minimum inflow of 19.90 m³/s, sitting 29.5% under even that minimum.
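The percentages above follow directly from the flows quoted in the report; a quick illustrative check of the arithmetic:

```python
may_inflow = 14.02      # mean inflow in May, m³/s
monthly_mean = 38.27    # climatological monthly mean inflow, m³/s
historical_min = 19.90  # historical minimum inflow, m³/s

below_mean = (1 - may_inflow / monthly_mean) * 100
below_min = (1 - may_inflow / historical_min) * 100

print(f"{below_mean:.1f}% below the monthly mean")        # 63.4%
print(f"{below_min:.1f}% below the historical minimum")   # 29.5%
```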

Projections

The Cantareira System water-scenario report, released periodically since January 2015, computes inflow projections with a hydrological model implemented by Cemaden, based on the seven-day rainfall forecast from Inpe's Center for Weather Forecasting and Climate Studies (CPTEC). From the eighth day onward, projections are presented under five rainfall scenarios (the historical average, and 25% and 50% below and above it). Finally, given a scenario for water extraction from the Cantareira System, projections of the evolution of stored volume are obtained.
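The projection logic described above amounts to a water balance stepped forward in time under scaled-rainfall scenarios. A minimal sketch of that idea (the scenario factors and extraction rate mirror the report's description, but the function and starting numbers below are illustrative assumptions, not Cemaden's actual model):

```python
SECONDS_PER_DAY = 86_400

def project_storage(storage_m3, inflow_mean_m3s, extraction_m3s,
                    scenario_factor, days):
    """Step a reservoir water balance forward under a rainfall scenario.

    scenario_factor scales the climatological mean inflow: 1.0 for average
    rainfall, 0.75 for 25% below average, 1.25 for 25% above, and so on.
    """
    for _ in range(days):
        inflow = inflow_mean_m3s * scenario_factor
        storage_m3 += (inflow - extraction_m3s) * SECONDS_PER_DAY
        storage_m3 = max(storage_m3, 0.0)  # storage cannot go below empty
    return storage_m3

# Illustrative run: 90 dry-season days at the report's 17.0 m³/s extraction,
# with an assumed mean inflow of 15.0 m³/s and 250 million m³ in storage
final = project_storage(storage_m3=250e6, inflow_mean_m3s=15.0,
                        extraction_m3s=17.0, scenario_factor=1.0, days=90)
```

Because extraction exceeds inflow in this sketch, storage drains steadily, which is the mechanism behind the report's below-average projections through December.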

In the latest report, total extraction from the Cantareira System was taken as 17.0 m³ per second from June 1st to August 31st and again in December 2015. From September 1st to November 30th, withdrawal from the reservoirs was taken as 13.5 m³ per second.

Under the scenario of rainfall at the climatological average, at the end of the dry season in early October the stored volume would be approximately 188.66 million m³. "That stored volume represents 14.9% of the Cantareira's total reserve, that is, the sum of the useful volume and the two dead volumes, estimated at 1,269.5 million m³ in total," notes Cemaden hydrologist Adriana Cuartas, who is responsible for the Cantareira report.

In this scenario of rainfall at the historical average, on December 1st, 2015 the stored volume would be approximately 227.72 million m³, representing 17.9% of the Cantareira's total reserve.

For a scenario of rainfall equal to the climatological average, the so-called dead volume 1 would be recovered roughly over the last week of December. Under a scenario of rainfall 25% above the climatological average, dead volume 1 would be recovered in the last week of November.

Access the document.

(MCTI, via Cemaden)

Climate forecast: intermittent earthquakes (Folha de S.Paulo)

Marcelo Leite, 03/05/2015 01h57

After Kathmandu, the earthquake in Nepal also shook a preconception common among science journalists; this column, for example, was jolted by a tweet from Matthew Shirts that linked to a story in Newsweek magazine.

Reading the piece, "More Fatal Earthquakes to Come, Climate Change Scientists Warn," brought back an embarrassing memory. May the tale serve to discourage our tendency to believe in established truths.

A newsroom colleague once asked whether I could write a piece explaining why tsunamis were becoming more frequent and how that related to global warming. I suppressed the urge to laugh and explained, condescendingly, that climate processes do not have the power to trigger geological events.

It is not quite so. Respectable researchers are investigating the hypothesis that climate change driven by global warming could indeed make earthquakes and volcanic eruptions more frequent.

This would be nothing new in Earth's history. A very recent example on the geological scale (the planet is more than 4 billion years old) occurred between 20,000 and 12,000 years ago, at the end of the last glacial period.

The retreat of continental glaciers kilometers thick relieved enough pressure on the Earth's crust to trigger intense volcanic activity. There is good evidence of this in places such as Iceland.

British geologist Bill McGuire has an even more worrying theory. He believes the 100-meter rise in sea level caused by the melting of the ice caps would also have triggered earthquakes and tsunamis (something that could repeat from now on, with the warming of the atmosphere).

The immense volume of water added to the oceans, pressing on their margins, would have destabilized geological faults near the coast, causing the tremors and submarine collapses that raise colossal waves. But McGuire's hypothesis, detailed in the book "Waking the Giant," still lacks the measurements and data needed to be accepted.

In the case of the Kathmandu earthquake, the presumed mechanism for blaming the climate is a different one: rain. Not just any downpour, but the powerful monsoons that batter India and Nepal from June to August.

Such a volume of water, second only to what moves through the Amazon basin, could alter the stress balance between the Indo-Australian and Asian plates. The Algerian geologist Pierre Bettinelli, then at Caltech, showed that seismic activity in the Himalayas is twice as intense in winter and attributed this to the seesaw of pressures between the two sides of the tectonic fault.

It remains to be proven, of course. But it is intriguing, that much is certain.

As for earthquakes caused by global warming, no one needs to rush out and buy survival kits. The deglaciation at the end of the last ice age took thousands of years, and the worst projections for sea level rise point to not much more than 1 m or 2 m by the end of this century.

No one is safe from tsunamis, though. There is some chance, perhaps once every 10,000 years, that the Brazilian coast could be struck by one, as may have happened to São Vicente in 1541 after a cataclysm somewhere in the Atlantic.

A climate on stilts ("Clima marombado", Folha de S.Paulo)

Marcelo Leite, 31/05/2015 01h45

Since the paper is full of good news these days, this column returns to its outsized predilection for unpopular bad tidings and announces: 2015 is on track to be hellish in the climate sphere as well.

It is likely, for example, that this year will break the global temperature record. The previous mark was held, of all years, by 2014. The ten hottest years on record have all occurred since 1998.

One of those who believe in a new record is the German Stefan Rahmstorf. The climatologist at the Potsdam Institute for Climate Impact Research, who became famous in 2007 for criticizing the IPCC's projections as too conservative, offered his prediction to 20 journalists from 17 countries gathered in Berlin 20 days ago.

January through April 2015 treated the planet to the hottest first four months of any year recorded since 1880. The 12-month period from May 2014 through April 2015 was also the worst on record for heat.

All of this was already happening while the El Niño phenomenon was still considered weak. This abnormal warming of Pacific waters off the west coast of South America, which tends to scorch the world's climate, gained strength this May and should persist into the second half of the year.

Terrible news for Brazil's Northeast. Pockets of the semiarid region are facing their fourth consecutive year of drought. Among the best-known effects of an El Niño is precisely a reduction in rainfall in that part of Brazil (along with increased precipitation in the South).

The situation is worse in India. By Friday (the 29th), a heat wave, the worst in two decades, with temperatures of 47 degrees Celsius, had caused more than 2,000 deaths. And El Niño could delay and weaken the monsoons, the torrential rains that begin in June and could cool the world's second most populous country.

While Indians roast, the people of Amazonas are underwater. The flooding of the Negro River, itself close to breaking records, has already disrupted the lives of 238,000 people in 33 municipalities of the state of Amazonas.

The state government has limited itself to remediation measures. More than 450 tonnes of non-perishable food have been distributed, along with "dormitory kits" (mattresses, hammocks and mosquito nets) and "personal hygiene kits" for thousands of displaced people.

The affected towns have also received 68 cubic meters of lumber and 750 kits of planks, rafters and battens for residents to build the elevated walkways known as "marombas."

This flood probably has nothing to do with El Niño, and it would likewise be difficult to demonstrate a causal link between the Indian heat wave and the Pacific anomaly. Both events are good examples, however, of the extreme situations that ongoing climate change is expected to make more frequent in the coming decades.

Judging by the pace of international negotiations, it seems ever more difficult, if not impossible, to avoid global warming of more than 2 degrees Celsius this century. That is the safety limit indicated by the IPCC.

Climate change is already locked in. There is little left to do but adapt, and preparing city infrastructure for it will take much more than improvised marombas.

So far, most atolls winning the sea level rise battle (Pacific Institute of Public Policy)

An increasing number of atoll studies are not supporting claims of Pacific island leaders that “islands are sinking.” Scientific studies published this year show, for example, that land area in Tuvalu’s capital atoll of Funafuti grew seven percent over the past century despite significant sea level rise. Another study reported that 23 of 27 atoll islands across Kiribati, Tuvalu and the Federated States of Micronesia either increased in area or remained stable over recent decades.

Speaking about Kiribati, Canadian climatologist Simon Donner commented in the Scientific American: ‘Right now it is clear that no one needs to immediately wall in the islands or evacuate all the inhabitants. What the people of Kiribati and other low-lying countries need instead are well-thought-out, customized adaptation plans and consistent international aid — not a breathless rush for a quick fix that makes the rest of the world feel good but obliges the island residents to play the part of helpless victim.’

These same climate scientists who are conducting ongoing research in Tuvalu, Kiribati and the Marshall Islands acknowledge the documented fact of sea level rise in the Pacific, and the potential threat this poses. But they are making the point, as articulated by Donner, that ‘the politicized public discourse on climate change is less nuanced than the science of reef islands.’

A recent report carried in Geology, the publication of the Geological Society of America, says Tuvalu has experienced ‘some of the highest rates of sea level rise over the past 60 years.’ At the same time, ‘no islands have been lost, the majority have enlarged, and there has been a 7.3 percent increase in net island area over the past century.’

To gain international attention to climate concerns and motivate funding to respond to what is described as climate damage, political leaders from the Pacific are predicting dire consequences.

‘The future viability of the Marshall Islands — and all island nations — is at stake,’ Marshall Islands Foreign Minister Tony deBrum told the global climate meeting in Peru last December.

‘It keeps me awake at night,’ said Tuvalu Prime Minister Enele Sopoaga in a recent interview. ‘Will we survive? Or will we disappear under the sea?’

Obviously, statements of island leaders at international meetings and the observations of recent scientific reports are at odds. Does it matter?

Comments Donner: ‘Exaggeration, whatever its impetus, inevitably invites backlash, which is bad because it can prevent the nation from getting the right kind of help.’

If we want to grab headlines, the ‘disappearing island’ theme is good. But finding solutions to, for example, the increasing number of ocean inundations that are occurring requires well-thought-out plans.

Scientists studying these low-lying islands should be seen as allies, whose information can be used to focus attention on key areas of need. For example, the New Zealand and Australian scientists working in Tuvalu said their results “show that islands can persist on reefs under rates of sea level rise on the order of five millimeters per year.” With sea level rates projected to double in the coming years, ‘it is unclear whether islands will continue to maintain their dynamic adjustment at these higher rates of change,’ they said. ‘The challenge for low-lying atoll nations is to develop flexible adaptation strategies that recognize the likely persistence of islands over the next century, recognize the different modes of island change, and accommodate the ongoing dynamism of island margins.’

Developing precise information on atoll nations as these scientists are doing is needed to inform policy makers and local residents as people are inundated with discussion about — and, possibly, outside donor funding for — ‘adaptation’ and ‘mitigation’ in these islands.

In the 1990s and early 2000s, the Nuclear Claims Tribunal in the Marshall Islands hired internationally recognized scientists and medical doctors to advise it on such things as radiation exposure standards for nuclear test clean up programs and medical conditions deserving of compensation, while evaluating U.S. government scientific studies on the Marshall Islands. These scientists and doctors provided knowledge and advice that helped inform the compensation and claims process.

It seems this nuclear test-related model would be of significant benefit to islands in the region, by linking independent climate scientists with island governments so there is a connection between science and climate policies and actions of governments.

‘The reality is that the next few decades for low-lying reef islands will be defined by an unsexy, expensive slog to adapt,’ wrote Donner in the Scientific American. ‘Success will not come from a single land purchase or limited-term aid projects. It will come from years of trial and error and a long-term investment by the international community in implementing solutions tailored to specific locales.’ He comments that a World Bank-supported adaptation program in Kiribati took eight years of consultation, training, policy development and identifying priorities to finally produce a plan of action. And even then, when sea walls were rolled out at several locations, there were design faults that needed to be fixed. Donner’s observation about Kiribati could equally apply to the rest of the Pacific: “Responding to climate change in a place like Kiribati requires a sustained commitment to building local scientific and engineering capacity and learning from mistakes.”

It is excellent advice.

Image: Low-lying islands, such as Majuro Atoll pictured here, are changing due to storms, erosion, high tides, seawalls and causeways, and sea level rise. But few are disappearing. Photo credit: Isaac Marty

Historic CO2 record (Observatório do Clima)

11/05/2015

By Claudio Angelo, OC –

The news traveled the world this week: the concentration of carbon dioxide in the atmosphere surpassed the symbolic mark of 400 parts per million in March, the US National Oceanic and Atmospheric Administration (NOAA) announced. It is the first time this has happened since the agency began measuring the gas at 40 different points around the planet in the 1980s.

The last time there was this much CO2 in the atmosphere, probably 3.5 million years ago, there were no human beings and no ice at the North Pole. The global mean temperature was about 3 °C higher than in the preindustrial period. Sea level was 4 to 5 meters higher than today.

The announcement was treated by the international press as a "red alert" in the year of the Paris climate conference, which is supposed to (though some doubt it will) mark the beginning of a solution to the global warming problem. Although the record is important in itself, the real problem is the trend it indicates.

Four hundred parts per million, or ppm, is a small number. It means that in every million molecules of air there are 400 of carbon dioxide (remember that the atmosphere is composed almost entirely of nitrogen and oxygen; CO2 is one of the "trace gases" that together make up 1% of the air's composition).

It happens that carbon dioxide is very much "small but mighty": it is extremely efficient at trapping in the atmosphere the heat that the Earth gives off as infrared radiation. Not content with that, it helps raise, through evaporation, atmospheric levels of another very potent greenhouse gas: water vapor. That's right: as your mother may have told you, even too much water is bad for you.

Measurements of the CO2 concentration in the atmosphere were begun in 1958 by the American Charles Keeling atop the Mauna Loa volcano in Hawaii. The site was chosen for being far from pollution sources that could bias the air samples. Mauna Loa, at 4,000 meters of altitude in the middle of the Pacific Ocean, is a good proxy for how CO2 is mixed throughout the global atmosphere.

When Keeling's measurements began, the CO2 concentration in the air stood at 315 ppm. In 2013 it exceeded 400 ppm at Mauna Loa for the first time, then fell back and closed the year at 393 ppm. The NOAA data show that the same signal has now been detected not at just one point but at dozens of different places around the world.

As in 2013, the value will fall over the coming months and close the year below 400 ppm. The oscillation occurs because at the end of winter in the Northern Hemisphere, home to most of the world's land (and therefore vegetation), there is a lot of carbon in the air, coming from the decomposition of the leaves that fell in autumn. In spring, regrowth sequesters that CO2 and the concentration falls again.

The problem, of course, is that this concentration has been rising at an accelerating pace year after year. Throughout the entire preindustrial period, the CO2 concentration in the atmosphere never exceeded 280 ppm. From the emergence of the human species to the year Keeling began his measurements, the increase was at most 12.5%. From Brazil's first World Cup victory to today, it has risen another 27%. The annual growth rate doubled between 2000 and 2010 compared with 1960–1970. Half of the increase recorded since the dawn of humanity has occurred since 1980.
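The percentages quoted above can be checked directly from the ppm figures in the text (280 ppm as the preindustrial ceiling, 315 ppm in 1958, roughly 400 ppm today); a quick illustrative check:

```python
preindustrial = 280.0  # ppm, preindustrial maximum
keeling_1958 = 315.0   # ppm, when Keeling's measurements began
today = 400.0          # ppm, the 2015 milestone

rise_to_1958 = (keeling_1958 / preindustrial - 1) * 100
rise_since_1958 = (today / keeling_1958 - 1) * 100

print(f"{rise_to_1958:.1f}% increase up to 1958")         # 12.5%
print(f"{rise_since_1958:.0f}% further increase since")   # 27%
```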

The so-called "Keeling curve," showing the growth of CO2 concentrations since the 1950s

At this pace, CO2 will have doubled relative to the preindustrial era before the end of the century. Climate models indicate that with twice as much CO2 in the air, the rise in Earth's temperature would be about 3 °C, well above the limit considered "safe" (and, for some, already unattainable) of 2 °C above the preindustrial average. According to the IPCC, the UN climate panel, to have a 50% chance of staying within 2 °C, CO2 levels would need to level off at 450 ppm and then fall.
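The "about 3 °C per doubling" figure corresponds to the standard logarithmic relation between CO2 concentration and equilibrium warming. A sketch, assuming a sensitivity of 3 °C per doubling (a commonly cited central value, not a figure computed by this article's models):

```python
import math

def warming(c_ppm, c0_ppm=280.0, sensitivity_per_doubling=3.0):
    """Equilibrium warming (°C) from the logarithmic CO2-forcing relation."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

warming(560.0)  # a full doubling of 280 ppm yields the full 3.0 °C
warming(400.0)  # the 2015 level implies roughly 1.5 °C at equilibrium
```

Under this assumed sensitivity, today's 400 ppm already commits the planet to about half of the warming that a full doubling would bring, which is why the remaining 50 ppm headroom to 450 ppm is so slim.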

The 400 ppm mark is a somewhat arbitrary number, but it matters for precisely this reason: only 50 ppm separate humanity from entering climate territory never before explored and, by all indications, not at all pleasant. (Observatório do Clima/ #Envolverde)

* Originally published on the Observatório do Clima website.

Sea level rise accelerating faster than thought (Science)

High tides swamp a playground in coastal Wales. DIMITRIS LEGAKIS/SPLASH NEWS/NEWSCOM

If you’re still thinking about buying that beach house, think again. A new study suggests that sea levels aren’t just rising; they’re gaining ground faster than ever. That’s contrary to earlier work that suggested rising seas had slowed in recent years.

The result won’t come as a shock to most climate scientists. Long-term records from coastal tide gauges have shown that sea level rise accelerated throughout the 20th century. Models predict the trend will continue. However, previous studies based on satellite measurements—which began in 1993 and provide the most robust estimates of sea level—revealed that the rate of rise had slowed in the past decade compared with the one before.

That recent slowdown puzzled researchers, because sea level contributions from melting ice in Antarctica and Greenland are actually increasing, says Christopher Watson, a geodesist at the University of Tasmania in Australia. So he and colleagues took a closer look at the available satellite and tide gauge data, and tried to correct for other factors that might skew sea level measurements, like small changes in coastal elevation.

The results, published today in Nature Climate Change, show that global mean sea level rose slightly slower than previously thought between 1993 and 2014, but that sea level rise is indeed accelerating. The new findings agree more closely with other records of changing sea levels, like those produced by tide gauges and bottom-up accounting of the contributions from ocean warming and melting ice.

In the past, researchers have used tide gauges to keep tabs on the performance of satellite altimeters, which use radar to measure the height of the sea surface. The comparison allowed them to sniff out and cope with any issues that cropped up with the satellite sensors. Tide gauges themselves are not immune to problems, however; the land on which they rest can shift during earthquakes, or subside because of groundwater withdrawal or sediment settling. These processes can produce apparent changes in sea level that have nothing to do with the oceans.

So Watson’s team tried to correct for the rise and fall of tide gauge sites by using nearby GPS stations, which measure land motions. If no GPS stations were present, they used computer models to estimate known changes, such as how some regions continue to rebound from the last glaciation, when heavy ice sheets caused land to sink.

The newly recalibrated numbers show that the earliest part of the satellite record, collected between 1993 and 1999 by the first altimetry mission, known as TOPEX/Poseidon, appears to have overstated sea level rise. That’s probably because a sensor deteriorated, ultimately forcing engineers to turn on a backup instrument. When combined with data from subsequent satellite missions, those inflated TOPEX/Poseidon numbers gave the appearance that sea level rise was decelerating, even as the global climate warmed.

Also contributing to the apparent slowdown was a hiccup caused by natural climate variation, says John Church, a climate scientist at the Commonwealth Scientific and Industrial Research Organisation in Hobart, Australia, and a co-author of the new study. Around 2011, “there was a major dip in sea level associated with major flooding events in Australia and elsewhere,” he says. Intense rainfall transferred water from the oceans to the continents, temporarily overriding the long-term sea level trend.

The corrected record now shows that sea level rose 2.6 millimeters to 2.9 millimeters per year since 1993, compared with prior estimates of 3.2 millimeters per year. Despite the slower rates, the study found that the rate of sea level rise is itself increasing by about 0.04 millimeters per year each year, although the acceleration is not statistically significant. Watson says he expects that trend to grow stronger as researchers collect more data.
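Acceleration of this kind is typically estimated by fitting a quadratic to the sea level series: the linear coefficient gives the starting rate, and twice the quadratic coefficient gives the acceleration. A sketch on an idealized series built from numbers in the ballpark of the study's, not the actual altimetry record:

```python
import numpy as np

t = np.arange(0.0, 22.0)  # years since 1993
# Idealized sea level (mm): 2.6 mm/yr base rate plus 0.04 mm/yr² acceleration
sea_level = 2.6 * t + 0.5 * 0.04 * t**2

c2, c1, c0 = np.polyfit(t, sea_level, 2)
rate = c1               # mm/yr at the start of the record
acceleration = 2 * c2   # mm/yr²
```

On real, noisy altimetry data the same fit yields an acceleration estimate with error bars, which is why a value as small as 0.04 mm/yr² over only two decades is not yet statistically significant.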

The acceleration falls in line with predictions from the Intergovernmental Panel on Climate Change (IPCC), Watson notes. “We’re tracking at that upper bound” of the IPCC’s business-as-usual scenario for greenhouse gas emissions, he says, which could bring up to one meter of sea level rise by 2100.

Others say it’s too early to tell. “The IPCC is looking way out in time,” says geodesist Steve Nerem of the University of Colorado, Boulder, who was not involved in the study. “This is only 20 years of data.”

In the meantime, Nerem says, the altimetry community needs to focus on continuing to improve the satellite data. He thinks Watson’s team “addressed it in the best way we can right now,” but it would be even better “to have a GPS receiver at every tide gauge, and right now that’s not the case.”

Regardless, the underlying message is clear, Church says: Sea levels are rising at ever increasing rates, and society needs to take notice.

Out of Place: Space/Time and Quantum (In)security (The Disorder of Things)

APRIL 21, 2015 – DRLJSHEPHERD

A demon lives behind my left eye. As a migraine sufferer, I have developed a very personal relationship with my pain and its perceived causes. On a bad day, with a crippling sensitivity to light, nausea, and the feeling that the blood flowing to my brain has slowed to a crawl and is the poisoned consistency of pancake batter, I feel the presence of this demon keenly.

On the first day of the Q2 Symposium, however, which I was delighted to attend recently, the demon was in a tricksy mood, rather than out for blood: this was a vestibular migraine. The symptoms of this particular neurological condition are dizziness, loss of balance, and sensitivity to motion. Basically, when the demon manifests in this way, I feel constantly as though I am falling: falling over, falling out of place. The Q Symposium, hosted by James Der Derian and the marvellous team at the University of Sydney’s Centre for International Security Studies, was intended, over the course of two days and a series of presentations, interventions, and media engagements, to unsettle, to make participants think differently about space/time and security, thinking through quantum rather than classical theory, but I do not think that this is what the organisers had in mind.

At the Q Station, located in Sydney where the Q Symposium was held, my pain and my present aligned: I felt out of place, I felt I was falling out of place. I did not expect to like the Q Station. It is the former quarantine station used by the colonial administration to isolate immigrants they suspected of carrying infectious diseases. Its location, on the North Head of Sydney and now within the Sydney Harbour National Park, was chosen for strategic reasons – it is secluded, easy to manage, a passageway point on the journey through to the inner harbour – but it has a much longer historical relationship with healing and disease. The North Head is a site of Aboriginal cultural significance; the space was used by the spiritual leaders (koradgee) of the Guringai peoples for healing and burial ceremonies.

So I did not expect to like it, as such an overt symbol of the colonisation of Aboriginal lands, but it disarmed me. It is a place of great natural beauty, and it has been revived with respect, I felt, for the rich spiritual heritage of the space, which extended long prior to the establishment of the Quarantine Station in 1835. When we Q2 Symposium participants were welcomed to country and invited to participate in a smoking ceremony to protect us as we passed through the space, we were reminded of this history and thus reminded – gently, respectfully (perhaps more respectfully than we deserved) – that this is not ‘our’ place. We were out of place.

We were all out of place at the Q2 Symposium. That is the point. Positioning us thus was deliberate: voluntary quarantine, to see whether it would produce new interactions and new insights, guided by the Q Vision of how quantum theory ‘responds to global events like natural and unnatural disasters, regime change and diplomatic negotiations that phase-shift with media interventions from states to sub-states, local to global, public to private, organised to chaotic, virtual to real and back again, often in a single news cycle’. It was two days of rich intellectual exploration and conversation, and – as is the case when these experiments work – beautiful connections began to develop between those conversations and the people conversing: conversations about peace, security, and innovation, big conversations about space and time.

I felt out of place. Mine is not the language of quantum theory. I learned so much from listening to my fellow participants, but I was insecure; as the migraine took hold on the first day, I was not only physically but intellectually feeling as though I was continually falling out of the moment, struggling to maintain the connections between what I was hearing and what I thought I knew.

Quantum theory departs from classical theory in the proposition of entanglement and the uncertainty principle:

This principle states the impossibility of simultaneously specifying the precise position and momentum of any particle. In other words, physicists cannot measure the position of a particle, for example, without causing a disturbance in the velocity of that particle. Knowledge about position and velocity are said to be complementary, that is, they cannot be precise at the same time.
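For reference, the standard formal statement of this principle (my notation, not part of the quoted passage) is the Heisenberg inequality:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}
```

where \(\Delta x\) and \(\Delta p\) are the standard deviations of a particle’s position and momentum and \(\hbar\) is the reduced Planck constant: shrinking the spread in one quantity necessarily inflates the spread in the other.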

I do not know anything about quantum theory – I found it hard to follow even the beginner’s guides provided by the eloquent speakers at the Symposium – but I know a lot about uncertainty. I also feel that I know something about entanglement, perhaps not as it is conceived of within quantum physics, but perhaps that is the point of events such as the Q Symposium: to encourage us to allow the unfamiliar to flow through and around us until the stream snags, to produce an idea or at least a moment of alternative cognition.

My moment of alternative cognition was caused by foetal microchimerism, a connection that flashed for me while I was listening to a physicist talk about entanglement. Scientists have shown that during gestation, foetal cells migrate into the body of the mother and can be found in the brain, spleen, liver, and elsewhere decades later. There are (possibly) parts of my son in my brain, literally and not just metaphorically (the metaphorical part was already clear). I am entangled with him in ways that I cannot comprehend. Listening to the speakers discuss entanglement, all I could think was, This is what entanglement means to me, it is in my body.

Perhaps I am not proposing entanglement as Schrödinger does, as ‘the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought’. Perhaps I am just using the concept of entanglement to denote the inextricable, inexplicable, relationality that I have with my son, my family, my community, humanity. It is this entanglement that undoes me, to use Judith Butler’s most eloquent phrase, in the face of grief, violence, and injustice. Perhaps this is the value of the quantum: to make connections that are not possible within the confines of classical thought.

I am not a scientist. I am a messy body out of place, my ‘self’ apparently composed of bodies out of place. My world is not reducible. My uncertainty is vast. All of these things make me insecure, challenge how I move through professional time and space as I navigate the academy. But when I return home from my time in quarantine and joyfully reconnect with my family, I am grounded by how I perceive my entanglement. It is love, not science, that makes me a better scholar.

photo of sign that says 'laboratory and mortuary' from Q station, sydney.

I was inspired by what I heard, witnessed, discussed at the Q2 Symposium. I was – and remain – inspired by the vision of the organisers, the refusal to be bound by classical logics in any field that turns into a drive, a desire to push our exploration of security, peace, and war in new directions. We need new directions; our classical ideas have failed us, and failed humanity, a point made by Colin Wight during his remarks on the final panel at the Symposium. Too often we continue to act as though the world is our laboratory; we have ‘all these theories yet the bodies keep piling up…‘.

But if this is the case, I must ask: do we need a quantum turn to get us to a space within which we can admit entanglement, admit uncertainty, admit that we are out of place? We are never (only) our ‘selves’: we are always both wave and particle and all that is in between, and it is our being entangled that renders us human. We know this from philosophy, from art and the humanities. Can we not learn it from them? Must we turn to science (again)? I felt diminished by the asking of these questions, insecure, but I did not feel that these questions were out of place.

Extending climate predictability beyond El Niño (Science Daily)

Date: April 21, 2015

Source: University of Hawaii – SOEST

Summary: Tropical Pacific climate variations and their global weather impacts may be predicted much further in advance than previously thought, according to research by an international team of climate scientists. The source of this predictability lies in the tight interactions between the ocean and the atmosphere and among the Atlantic, the Pacific and the Indian Oceans. Such long-term tropical climate forecasts are useful to the public and policy makers, researchers say.


This image shows inter-basin coupling as a cause of multi-year tropical Pacific climate predictability: Impact of Atlantic warming on global atmospheric Walker Circulation (arrows). Rising air over the Atlantic subsides over the equatorial Pacific, causing central Pacific sea surface cooling, which in turn reinforces the large-scale wind anomalies. Credit: Yoshimitsu Chikamoto

Tropical Pacific climate variations and their global weather impacts may be predicted much further in advance than previously thought, according to research by an international team of climate scientists from the USA, Australia, and Japan. The source of this predictability lies in the tight interactions between the ocean and the atmosphere and among the Atlantic, the Pacific and the Indian Oceans. Such long-term tropical climate forecasts are useful to the public and policy makers.

At present, computer simulations can predict the occurrence of an El Niño event at best three seasons in advance. Climate modeling centers worldwide generate and disseminate these forecasts on an operational basis. Scientists have assumed that the skill and reliability of such tropical climate forecasts drop rapidly for lead times longer than one year.

The new findings of predictable climate variations up to three years in advance are based on a series of hindcast computer modeling experiments, which included observed ocean temperature and salinity data. The results are presented in the April 21, 2015, online issue of Nature Communications.

“We found that, even three to four years after starting the prediction, the model was still tracking the observations well,” says Yoshimitsu Chikamoto at the University of Hawaii at Manoa International Pacific Research Center and lead author of the study. “This implies that central Pacific climate conditions can be predicted over several years ahead.”

“The mechanism is simple,” states co-author Shang-Ping Xie from the University of California San Diego. “Warmer water in the Atlantic heats up the atmosphere. Rising air and increased precipitation drive a large atmospheric circulation cell, which then sinks over the Central Pacific. The relatively dry air feeds surface winds back into the Atlantic and the Indian Ocean. These winds cool the Central Pacific, leading to conditions similar to a La Niña Modoki event. The central Pacific cooling then strengthens the global atmospheric circulation anomalies.”

“Our results present a paradigm shift,” explains co-author Axel Timmermann, climate scientist and professor at the University of Hawaii. “Whereas the Pacific was previously considered the main driver of tropical climate variability and the Atlantic and Indian Ocean its slaves, our results document a much more active role for the Atlantic Ocean in determining conditions in the other two ocean basins. The coupling between the oceans is established by a massive reorganization of the atmospheric circulation.”

The impacts of the findings are wide-ranging. “Central Pacific temperature changes have a remote effect on rainfall in California and Australia. Seeing the Atlantic as an important contributor to these rainfall shifts, which happen as far away as Australia, came to us as a great surprise. It highlights the fact that on multi-year timescales we have to view climate variability in a global perspective, rather than through a basin-wide lens,” says Jing-Jia Luo, co-author of the study and climate scientist at the Bureau of Meteorology in Australia.

“Our study fills the gap between the well-established seasonal predictions and internationally ongoing decadal forecasting efforts. We anticipate that the main results will soon be corroborated by other climate computer models,” concludes co-author Masahide Kimoto from the University of Tokyo, Japan.

Journal Reference:

  1. Yoshimitsu Chikamoto, Axel Timmermann, Jing-Jia Luo, Takashi Mochizuki, Masahide Kimoto, Masahiro Watanabe, Masayoshi Ishii, Shang-Ping Xie, Fei-Fei Jin. Skilful multi-year predictions of tropical trans-basin climate variability. Nature Communications, 2015; 6: 6869. DOI: 10.1038/ncomms7869

Michael Lewis: Our Appetite for Apocalypse (Radio Open Source)

AUDIO

Michael Lewis is the non-fiction novelist of our apocalyptic American mindset in 2010. The heroes of The Big Short, as he puts it in conversation “were betting on the end of the world… The only characters you can really trust are the people who are delivering a very, very dark message.”

Michael Lewis, remember, was never really a sportswriter, despite Moneyball, Coach, and The Blind Side. Nor was he ever a finance guy, despite the prescience of Liar’s Poker and his sure touch now with the Wall Street collapse of 2007-2008. Michael Lewis’s real business and his genius instinct is for resonant social fables that just happen to play out on ballfields and bond markets.

The Big Short is a high literary feat, complete with a real-life “unreliable narrator,” a particularly despised contrarian bond dealer, Greg Lippmann, who was betting brazenly against his own market. “The guy selling the best ideas is a completely untrustworthy character,” the author remarks. The true center of The Big Short is an atmosphere of anxiety that has developed a taste for the catastrophic. Lewis’s short-selling characters resonate because they’re acting out our common sense of “the probability of extreme change” in financial markets and in real life. It’s an anxiety that envelops Tea Baggers and Greenpeaceniks in the same cloud of anger.

ML: The broader thing about all these characters to me is that their attitudes, their approach to life, their ability to hear the data, was something that was marginalized in the system itself. They didn’t belong, none of them belonged, and they should have belonged. What is it about the system that doesn’t want them as a part of it? And it’s terrifying when all the people who were wrong are in charge, and all the people who are right are on the outside.

CL: It sure is. To me there’s a direct analogy to be drawn with the war in Iraq. The Congress signed off “oh well, he must know something.” Tony Blair embraced it. The media by and large encouraged it. A very, very few people said “are you kidding?” And yet the ones that warned against the war in Iraq got the same prize that your guys got for warning of the meltdown.

ML: Yes. Ostracism.

CL: Exactly, and they’re still ostracized.

ML: It’s funny. There is an analogy. And the analogy is there’s a kind of a blind faith in leadership that is the result in both cases of ordinary people feeling they can’t evaluate the situation because it’s too complicated. The financial system got so complicated, and the complexity became opacity. When Alan Greenspan stands up and says something, no one understands what he’s saying. But they think that’s a good thing, because it’s all so complicated they shouldn’t understand what he’s saying. And the fact is they should. The fact is, if things aren’t being explained in a way you and I can understand them, it should be a bad sign, not a good sign. But the complexity was turned on its head. It was used as a way to mask bad things that were happening.

There’s a joke in it all. The joke is that the financial system, and there are analogies to the political system, but the financial system wanted to do something it really shouldn’t do. It wanted to make lots of loans that it shouldn’t make. They created all this risk that was going to blow up the system. In order to do that they needed to disguise the risk. So to disguise the risk it used all this complexity, which served as a smokescreen. And the joke is that it ended up disguising the risk from itself. That the very people who created the smokescreen were engulfed in it, and they couldn’t parse the system they created.

Michael Lewis with Chris Lydon in Boston, April 7, 2010.

Coping with the anthropocene: How we became nature (Science Daily)

Date: March 17, 2015

Source: De Gruyter

Summary: Overpopulation, the greenhouse effect, warming temperatures and overall climate disruption are all well recognized as a major threat to the ecology and biodiversity of the Earth.  The issue of humankind’s negative impact on the environment, albeit hotly debated and continuously present in the public eye, still only leads to limited policy action.


Overpopulation, the greenhouse effect, warming temperatures and overall climate disruption are all well recognized as major threats to the ecology and biodiversity of the Earth. The issue of humankind’s negative impact on the environment, albeit hotly debated and continuously present in the public eye, still leads to only limited policy action. Urgent action is required, insist Paul Crutzen and Stanislaw Waclawek, the authors of “Atmospheric Chemistry and Climate in the Anthropocene,” published in open access in the new journal Chemistry-Didactics-Ecology-Metrology.

In their sobering review, Crutzen, the 1995 Nobel Laureate in Chemistry, and Waclawek, outline the development of a new geological epoch — the Anthropocene, where human actions become a global geophysical force, surpassing that of nature itself.

The Anthropocene denotes the present geological epoch, in which human actions determine the behavior of the planet Earth to a greater degree than other natural processes. The term, coined by American ecologist Eugene F. Stoermer and popularized by Crutzen, designates the epoch succeeding the Holocene, the official term for the present epoch on the Geological Time Scale, covering the last 11,500 years.

Although the Anthropocene is not a new concept, it is only now that the authors present such stunning evidence in support of the claim. The article describes the negative impact of the human footprint, which is bringing about the gradual destruction of the Earth. Drawing on a range of data, it yields overwhelming evidence that “man, the eroder” now transforms the atmospheric, geologic, hydrologic, biospheric, and other earth system processes.

The list is long and unforgiving:

· Excessively rapid climate change, so that ecosystems cannot adapt

· The Arctic ocean ice cover is thinner by approximately 40% compared to 20-40 years ago

· Ice loss and the growing sea levels

· Overpopulation (fourfold increase in the 20th century alone)

· Increasing demand for freshwater

· Releases of NO into the atmosphere, resulting in elevated surface ozone levels

· Loss of agricultural soil through erosion

· Loss of phosphorus, with dangerous depletion in agricultural regions

· Dwindling phosphate reserves (leading to serious reductions in crop yield)

Describing the negative impact of human activities on the environment, the authors identify planetary boundaries as a means of attaining global sustainability. “It is a well-documented summary of all humankind’s actions affecting the environment on all scales. According to Crutzen, we live in a new era, the Anthropocene, and our survival fully depends on us. I strongly recommend this unusual publication in the form of highly informative compressed slides and graphs,” says Marina Frontasyeva from the Joint Institute for Nuclear Research in Dubna, Russia. Nature is us, and responding to the Anthropocene means building a culture that grows with the Earth’s biological wealth instead of depleting it.


Journal Reference:

  1. Paul J. Crutzen, Stanisław Wacławek. Atmospheric Chemistry and Climate in the Anthropocene. Chemistry-Didactics-Ecology-Metrology, 2015. DOI: 10.1515/cdem-2014-0001

Geoengineering proposal may backfire: Ocean pipes ‘not cool,’ would end up warming climate (Science Daily)

Date: March 19, 2015

Source: Carnegie Institution

Summary: There are a variety of proposals that involve using vertical ocean pipes to move seawater to the surface from the depths in order to reap different potential climate benefits. One idea involves using ocean pipes to facilitate direct physical cooling of the surface ocean by replacing warm surface ocean waters with colder, deeper waters. New research shows that these pipes could actually increase global warming quite drastically.


To combat global climate change caused by greenhouse gases, alternative energy sources and other kinds of environmental intervention are needed. There are a variety of proposals that involve using vertical ocean pipes to move seawater to the surface from the depths in order to reap different potential climate benefits. A new study from a group of Carnegie scientists determines that these types of pipes could actually increase global warming quite drastically. It is published in Environmental Research Letters.

One proposed strategy–called Ocean Thermal Energy Conversion, or OTEC–involves using the temperature difference between deeper and shallower water to power a heat engine and produce clean electricity. A second proposal is to move carbon from the upper ocean down into the deep, where it wouldn’t interact with the atmosphere. Another idea, and the focus of this particular study, proposes that ocean pipes could facilitate direct physical cooling of the surface ocean by replacing warm surface ocean waters with colder, deeper waters.

“Our prediction going into the study was that vertical ocean pipes would effectively cool the Earth and remain effective for many centuries,” said Ken Caldeira, one of the three co-authors.

The team, which also included lead author Lester Kwiatkowski as well as Katharine Ricke, configured a model to test this idea and what they found surprised them. The model mimicked the ocean-water movement of ocean pipes if they were applied globally reaching to a depth of about a kilometer (just over half a mile). The model simulated the motion created by an idealized version of ocean pipes, not specific pipes. As such the model does not include real spacing of pipes, nor does it calculate how much energy they would require.

Their simulations showed that while global temperatures could be cooled by ocean pipe systems in the short term, warming would actually start to increase just 50 years after the pipes go into use. Their model showed that vertical movement of ocean water resulted in a decrease of clouds over the ocean and a loss of sea-ice.

Colder air is denser than warm air. Because of this, the air over the ocean surface that has been cooled by water from the depths has a higher atmospheric pressure than the air over land. The cool air over the ocean sinks downward reducing cloud formation over the ocean. Since more of the planet is covered with water than land, this would result in less cloud cover overall, which means that more of the Sun’s rays are absorbed by Earth, rather than being reflected back into space by clouds.

Water mixing caused by ocean pipes would also bring sea ice into contact with warmer waters, resulting in melting. What’s more, this would further decrease the reflection of the Sun’s radiation, which bounces off ice as well as clouds.

After 60 years, the pipes would cause an increase in global temperature of up to 1.2 degrees Celsius (2.2 degrees Fahrenheit). Over several centuries, the pipes would put the Earth on a warming trend toward a temperature increase of 8.5 degrees Celsius (15.3 degrees Fahrenheit).
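As a quick sanity check on those figures: a temperature *change* converts from Celsius to Fahrenheit by the 9/5 scale factor alone (the +32 offset applies only to absolute temperatures, not to differences). A minimal sketch:

```python
def delta_c_to_f(delta_c):
    """Convert a temperature DIFFERENCE from Celsius to Fahrenheit.
    Differences scale by 9/5; the +32 offset applies only to
    absolute temperatures, not to changes."""
    return delta_c * 9 / 5

print(round(delta_c_to_f(1.2), 1))  # 2.2 -- the 60-year figure
print(round(delta_c_to_f(8.5), 1))  # 15.3 -- the long-term figure
```

Both results match the article's Fahrenheit values (2.16 rounds to 2.2).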

“I cannot envisage any scenario in which a large scale global implementation of ocean pipes would be advisable,” Kwiatkowski said. “In fact, our study shows it could exacerbate long-term warming and is therefore highly inadvisable at global scales.”

The authors do say, however, that ocean pipes might be useful on a small scale to help aerate ocean dead zones.


Journal Reference:

  1. Lester Kwiatkowski, Katharine L. Ricke and Ken Caldeira. Atmospheric consequences of disruption of the ocean thermocline. Environmental Research Letters, 2015. DOI: 10.1088/1748-9326/10/3/034016

Welcome to Global Warming’s Terrifying New Era (Slate)

By Eric Holthaus

19 March 2015


Storm damage in Port Vila, Vanuatu. Photo by UNICEF via Getty Images

On Wednesday, the National Oceanic and Atmospheric Administration announced that Earth’s global temperature for February was among the hottest ever measured. So far, 2015 is tracking above record-warm 2014—which, when combined with the newly resurgent El Niño, means we’re on pace for another hottest year in history.

In addition to the just-completed warmest winter on record globally (despite the brutal cold and record snow in the eastern U.S.), new data on Thursday from the National Snow and Ice Data Center show that this year’s peak Arctic sea ice reached its lowest ever maximum extent, thanks to “an unusual configuration of the jet stream” that greatly warmed the Pacific Ocean near Alaska.

But here’s the most upsetting news. It’s been exactly 30 years since the last time the world was briefly cooler than its 20th-century average. Every single month since February 1985 has been hotter than the long-term average—that’s 360 consecutive months.

More than just being a round number, the 30-year streak has deeper significance. In climatology, a continuous 30-year stretch of data is traditionally what’s used to define what’s “normal” for a given location. In a very real way, we can now say that for our given location—the planet Earth—global warming is now “normal.” Forget debating—our climate has officially changed.
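The streak arithmetic here (360 consecutive months = 30 years) can be sketched in a few lines of Python; the anomaly series below is invented for illustration, not real NOAA data:

```python
def warm_streak(monthly_anomalies):
    """Length of the run of consecutive warmer-than-average months at
    the end of a series of temperature anomalies (degrees C relative
    to the long-term average, oldest first)."""
    streak = 0
    for anomaly in reversed(monthly_anomalies):
        if anomaly > 0:   # hotter than the long-term average
            streak += 1
        else:             # first at-or-below-average month ends the run
            break
    return streak

# Thirty years of monthly data, every month above average,
# as the article describes:
series = [0.1] * 360
print(warm_streak(series))  # 360
```

With one below-average month anywhere near the end, the count resets from that point, which is what makes an unbroken 30-year run notable.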

This 30-year streak should change the way we think and talk about this issue. We’ve entered a new era in which global warming is a defining characteristic and a fundamental driver of what it means to be an inhabitant of planet Earth. We should treat it that way. For those who care about the climate, that may mean de-emphasizing statistics and science and beginning to talk more confidently about the moral implications of continuing on our current path.

Since disasters disproportionately impact the poor, climate change is increasingly an important economic and social justice issue. The pope will visit the United States later this year as part of a broader campaign by the Vatican to directly influence the outcome of this year’s global climate negotiations in Paris—recent polling data show his message may be resonating, especially with political conservatives and nonscience types. Two-thirds of Americans now believe that world leaders are morally obligated to take steps to reduce carbon.

Scientists and journalists have debated the connection between extreme weather and global warming for years, but what’s happening now is different. Since weather impacts virtually every facet of our lives (at least in a small way), and since climate change is affecting weather at every point in the globe every day (at least in a small way), that makes it at the same time incredibly difficult to study and incredibly important. Formal attribution studies that attempt to scientifically tease out whether global warming “caused” individual events are shortsighted and miss the point. It’s time for a change in tack. The better question to ask is: How do we as a civilization collectively tackle the weather extremes we already face?

In the aftermath of the nearly unprecedented power and destructive force of Cyclone Pam’s landfall in the remote Pacific island nation of Vanuatu—where survivors were forced to drink saltwater—emerges perhaps the best recent example I’ve seen of a government acknowledging this changed climate in a scientifically sound way:

Cyclone Pam is a consequence of climate change since all weather is affected by the planet’s now considerably warmer climate. The spate of extreme storms over the past decade—of which Pam is the latest—is entirely consistent in science with the hottest ever decade on record.

The statement was from the government of the Philippines, the previous country to suffer a direct strike by a Category 5 cyclone—Haiyan in 2013. As chair of the Climate Vulnerable Forum negotiating bloc, the Philippines also called for a strengthening of ambition in the run-up to this year’s global climate agreement in Paris.

The cost of disasters of all types is rising around the globe as population and wealth increase and storms become more fierce. This week in Japan, 187 countries agreed on a comprehensive plan to reduce loss of life from disasters as well as their financial impact. However, the disaster deal is nonbinding and won’t provide support to the most vulnerable countries.

Combining weather statistics and photos of devastated tropical islands with discussions of political and economic winners and losers is increasingly necessary as climate change enters a new era. We’re no longer describing the problem. We’re telling the story of how humanity reacts to this new normal.

As the Guardian’s Alan Rusbridger, in an editorial kickoff of his newspaper’s newly heightened focus on climate, said, “the mainstream argument has moved on.” What’s coming next isn’t certain, but it’s likely to be much more visceral and real than steadily upward sloping lines on a graph.

Sabesp rushes works forward without assessing risks (OESP)

Fabio Leite – O Estado de S. Paulo

15 March 2015 | 02:01

The Companhia de Saneamento Básico do Estado de São Paulo has pulled plans off the shelf without time to study their environmental impact

SÃO PAULO – The search for new water sources to relieve the short-term shortage and avoid official water rationing in Greater São Paulo has led the Companhia de Saneamento Básico do Estado de São Paulo (Sabesp) to take a series of long-shelved projects off the drawing board and execute them at breakneck speed, without an Environmental Impact Study (EIA), approval by basin committees, or a declared state of emergency.

So far, six works (one already completed) involve transfers between rivers and reservoirs, with the aim of increasing the water supply enough to serve 20 million people through the dry season (April to September) without declaring general rationing. The main one is the interconnection of the Rio Grande System with the Alto Tietê, the second most critical source (at 21% of capacity), better only than the Cantareira.

According to Sabesp, construction has already begun on 11 kilometers of pipeline and a pumping station to carry up to 4,000 liters per second from the Billings reservoir, in the ABC region, to the Taiaçupeba reservoir, in Suzano; completion is expected in July. Technicians in the administration of Governor Geraldo Alckmin (PSDB) say, however, that a work of this scale would require an EIA, approval by the Alto Tietê Basin Committee, and a permit from the São Paulo Department of Water and Electric Energy (DAEE).

With the likely diversion of water from Billings’ polluted central body into the Rio Grande arm, already signaled by Sabesp, prior approval by the State Environmental Council (Consema) and a permit from the National Electric Energy Agency (Aneel) would also be required, since the reservoir also supplies water for power generation at the Henry Borden plant, in Cubatão. That entire procedure had to be followed for the Billings–Guarapiranga connection, via the Taquacetuba arm, during the 2000 crisis.

“Either the government declares a state of emergency to carry out these so-called emergency works without bidding or an environmental impact study, at the cost of competition and public participation, or it holds tenders and produces the necessary reports. As things stand, there is a brutal incoherence,” said engineer Darcy Brega Filho, a sustainability-management specialist and former Sabesp employee.

Sea. The emergency package also includes connecting two coastal rivers (which drain to the sea), the Itatinga and the Capivari, to rivers that are tributaries of the Jundiaí (Alto Tietê) and Guarapiranga reservoirs. Both interventions, just announced by Sabesp, were already in the 2004 Water and Supply Master Plan (PDAA) and had been shelved. Each should add 1,000 liters per second to the systems’ flow, and each would also require approval from the Baixada Santista Basin Committee.

“Emergency works to bring water to the metropolitan region are without doubt needed, but that does not do away with the need for a more careful evaluation of this set of transfers, to gauge the projects’ efficiency and their indirect effects,” said water-resources specialist José Galizia Tundisi, president of the International Institute of Ecology and vice president of the Acqua Institute.

One example government staff cite of the lack of project evaluation is the construction of 9 kilometers of pipeline to carry 1,000 liters per second from the Guaió River to the Taiaçupeba reservoir. Work began in February and should be finished in May, according to Sabesp. Technicians in the field say that during the dry season the river’s average flow is only 300 liters per second, that is, 70% less than the intended volume.

How Silicon Valley controls our future (Fear and the Technopanic)

Translated: THE WORLD GOVERNMENT
How Silicon Valley controls our future

Jeff Jarvis

Oh, My!

Just 12 hours ago, I posted a brief piece about the continuing Europtechnopanic in Germany and the effort of publishers there to blame their every trouble on Google—even the so-called sin of free content and the price of metaphoric wurst.

Now Germany one-ups even itself with the most amazing specimen of Europtechnopanic I have yet seen. The cover of Der Spiegel, the country’s most important news outlet, makes the titans of Silicon Valley look dark, wicked, and, well—I just don’t know how else to say it—all too much like this.

This must be Spiegel’s Dystopian Special Issue. Note the additional cover billing: “Michel Houellebecq: ‘Humanism and enlightenment are dead.’”

I bought the issue online—you’re welcome—so you can read along with me (and correct my translations, please).

The cover story gets right to the point. Inside, the opening headline warns: “Tomorrowland: In Silicon Valley, a new elite doesn’t just want to determine what we consume but how we live. They want to change the world and accept no regulation. Must we stop them?”

Ah, yes, German publishers want to regulate Google—and now, watch out, Facebook, Apple, Uber, and Yahoo! (Yahoo?), they’re gunning for you next.

Turn the page and the first thing you read is this: “By all accounts, Travis Kalanick, founder and head of Uber, is an asshole.”

Oh, my.

It continues: “Uber is not the only company with plans for such world conquest. That’s how they all think: Google and Facebook, Apple and Airbnb, all those digital giants and thousands of smaller firms in their neighborhood. Their goal is never the niche but always the whole world. They don’t follow delusional fantasies but have thoroughly realistic goals in sight. It’s all made possible by a Dynamic Duo almost unique in economic history: globalization coupled with digitalization.”

Digitalization, you see, is not just a spectre haunting Europe but a dark force overcoming the world. Must it be stopped? We’re merely asking.

Spiegel’s editors next fret that progress “will be faster and bigger, like an avalanche”: iPhone, self-driving cars, the world’s knowledge now digital and retrievable, 70% of stock trading controlled by algorithms, commercial drones, artificial intelligence, robots. “Madness but everyday madness,” Spiegel cries. “No longer science fiction.”

What all this means is misunderstood, Spiegel says, “above all by politicians,” who must decide whether to stand by as spectators while “others organize a global revolution. Because what is happening is much more than the triumph of new technology, much more than an economic phenomenon. It’s not just about ‘the internet’ or ‘social networks,’ not about intelligence and Edward Snowden and the question of what Google does with data.” It’s not just about newspapers shutting down and jobs lost to software. We are in the path of social change, “which in the end no one can escape.” Distinct from the industrial revolution, this time “digitization doesn’t just change industries but how we think and how we live. Only this time the change is controlled centrally by a few hundred people…. They aren’t stumbling into the future, they are ideologues with a clear agenda…. a high-tech doctrine of salvation.”

Nerdnazis.

Oh, fuck!

The article then takes us on a tour of our new world capital, home to our “new Masters of the Universe,” who—perversely, apparently—are not concerned primarily about money. “Power through money isn’t enough for them.” It examines the roots of their philosophy from the “tradition of radical thinkers such as Noam Chomsky, Ayn Rand, and Friedrich Hayek,” leading to a “strange mixture of esoteric hippie-thinking and bare-knuckled capitalism.” Spiegel calls it their Menschheitsbeglückungswerks. I had to ask Twitter WTF that means.

Aha. So must we just go along with having this damned happiness shoved down our throats? “Is now the time for regulation before the world is finally dominated by digital monopolies?” Spiegel demands — I mean, merely asks? “Is this the time for democratic societies to defend themselves?”

Spiegel then visits four Silicon Valley geniuses: singularity man Ray Kurzweil; the conveniently German Sebastian Thrun, he of the self-driving car and online university; the always-good-for-a-WTF Peter Thiel (who was born in Germany but moved away after a year); and Airbnb’s Joe Gebbia. It recounts German President Joachim Gauck telling Thrun, “You scare me.” And it allows Thrun to respond that it’s the optimists, not the naysayers, who change the world.

I feared that these hapless four would be presented as ugly caricatures of the frightening, alien tribe of dark-bearded technopeople. You know what I’m getting at. But I’m relieved to say that’s not the case. What follows all the fear-mongering bluster of the cover story’s start is actual reporting. That is to say, a newsmagazine did what a newsmagazine does: It tops off its journalism with its agenda: frosting on the cupcake. And the agenda here is that of German publishers—some of them, which I explored last night and earlier. They attack Google and enlist politicians to do their bidding with new regulations to disadvantage their big, new, American, technological competitors.

And you know what? The German publishers’ strategy is working. German lawmakers passed a new ancillary copyright (nevermind that Google won that round when publishers gave it permission to quote their snippets) and EU politicians are talking not just about creating new copyright and privacy law but even about breaking up Google. The publishers are bringing Google to heel. The company waited far too long to empathize with publishers’ plight—albeit self-induced—and to recognize their political clout (a dangerous combination: desperation and power, as Google now knows). Now see how Matt Brittin, the head of EMEA for Google, drops birds at Europe’s feet like a willing hund, showing all the good that Google does indeed bring them.

I have also noted that Google is working on initiatives with European publishers to find mutual benefit and I celebrate that. That is why—ever helpful as I am—I wrote this post about what Google could do for news and this one about what news could do for Google. I see real opportunity for enlightened self-interest to take hold both inside Google and among publishers and for innovation and investment to come to news. But I’m one of those silly and apparently dangerous American optimists.

As I’ve often said, the publishers—led by Mathias Döpfner of Axel Springer and Paul-Bernhard Kallen of Burda—are smart. I admire them both. They know what they’re doing, using the power of their presses and thus their political clout to box in even big, powerful Google. It’s a game to them. It’s negotiation. It’s just business. I don’t agree with or much like their message or the tactic. But I get it.

Then comes this Scheißebombe from Der Spiegel. It goes far beyond the publishers’ game. It is nothing less than prewar propaganda, trying to stir up a populace against a boogeyman enemy in hopes of goading politicians to action to stop these people. If anyone would know better, you’d think they would. Schade.

After heavy rain, Cantareira system rises again and reaches 11.1% (Folha de S.Paulo)

FROM SÃO PAULO

26/02/2015, 09:44

The heavy rain that hit Greater São Paulo on Wednesday afternoon (the 25th) pushed the Cantareira’s level up 0.3 percentage point from the previous day. The system now operates at 11.1%, a level still considered critical.

Although the rise was larger than on previous days, it was limited because, according to meteorologists, the heavy rain that hit São Paulo did not pass over the reservoir region, where rainfall was more moderate.

Sabesp representatives nevertheless reiterated on Wednesday, at a session of the São Paulo City Council, that no rotating water rationing is planned for Greater São Paulo, even with rainfall below forecast for March.

February, according to figures released by Sabesp, will close with rainfall well above the historical average: the Cantareira has recorded 293 mm of rain so far, against a monthly average of 199.1 mm.

The Cantareira supplies 6.2 million people in the northern zone and in parts of the eastern, western, central and southern zones of the city of São Paulo; before the crisis it was about 9 million. The difference is now served by other systems.

Since July 2014, amid the severe water crisis, the São Paulo state government has tapped two reserves at the bottom of the reservoirs, known as the “dead volume.” In the Cantareira, this volume, spread across three different reservoirs, is the portion that sits below the intake pipes; to be used, it must be pumped.

The second tranche of the dead volume, 105 billion litres, began to be used in November. This Tuesday, when the system reached 10.7% of capacity, the equivalent of that tranche had been recovered. The first tranche, of 182.5 billion litres, may take one or two years to recover; that will happen when the reservoir level reaches 29.2%.
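As a back-of-envelope check (my inference from the figures above, not a calculation the article makes), the spacing between the two recovery marks implies the system's useful storage capacity:

```python
# The first dead-volume tranche (182.5 billion litres) spans the interval
# between the two recovery marks: 10.7% -> 29.2% of capacity.
first_tranche_litres = 182.5e9
mark_low, mark_high = 10.7, 29.2   # percent of capacity

litres_per_point = first_tranche_litres / (mark_high - mark_low)
implied_capacity = litres_per_point * 100

# Roughly a trillion litres of useful storage.
assert 0.9e12 < implied_capacity < 1.1e12
```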

Specialists compare drawing on the dead volume to using an overdraft. Environmentalists also point to risks, such as exhausting the reservoir system’s technical reserve.

Rubens Fernando Alencar and Pilker/Folhapress

OTHER RESERVOIRS

The Alto Tietê reservoir system, which also suffers the effects of the drought, operates at 18.3% of capacity, the same level recorded four days ago.

The system supplies 4.5 million people in the eastern part of the capital and in Greater São Paulo. On December 14th the Alto Tietê began drawing on its own dead volume, which added 39.5 million cubic metres of water from the Ponte Nova reservoir in Salesópolis (97 km from São Paulo).

The Guarapiranga reservoir, which provides water for 5.2 million people in the southern and southeastern zones of the capital, rose 1.1 percentage point and operates at 59.8% of capacity.

The Rio Grande reservoir, which serves 1.5 million people, fell 0.1 percentage point and now operates at 83.3%. The Rio Claro reservoir, which also serves 1.5 million people, rose 0.2 percentage point and operates at 35.7%.

The Alto Cotia system also improved, going from 36.4% to 37.7%. The reservoir supplies water to 400,000 people.

Sabesp’s reading is taken daily and covers a 24-hour period, from 7 a.m. to 7 a.m.

Drought in the Cantareira system

Nacho Doce – Dec. 4, 2014/Reuters

In 10 years, water shortage will affect 2.9 billion (Estadão)

The countries facing the worst deficits will be those with the fewest resources and young, growing populations

An international report released yesterday warns that in 15 years global demand for fresh water will be 40% greater than supply. The countries facing the worst deficits will be those with the fewest resources and young, growing populations. The document, from the United Nations University’s Institute for Water, Environment and Health (INWEH), based in Canada, predicts that within 10 years 48 countries, home to 2.9 billion people, will be classified as “water-scarce” or “water-stressed.”

The full text is available at: http://digital.estadao.com.br/download/pdf/2015/02/25/A15.pdf

(O Estado de S.Paulo)

Cantareira recovers 2nd dead-volume tranche (Estadão)

The Cantareira System reached 10.7% of capacity yesterday

After its 19th consecutive rise, the Cantareira System’s level reached 10.7% of capacity yesterday, the mark that signals the “recovery” of the second tranche of the dead volume, according to the São Paulo State Basic Sanitation Company (Sabesp) and the National Water Agency (ANA).

The full text is available at: http://digital.estadao.com.br/download/pdf/2015/02/25/A15.pdf

(O Estado de S.Paulo)

More on the subject in Folha de S.Paulo – http://www1.folha.uol.com.br/fsp/cotidiano/209589-cantareira-recupera-2-parte-do-volume-morto.shtml

When Exponential Progress Becomes Reality (Medium)

Niv Dror

“I used to say that this is the most important graph in all the technology business. I’m now of the opinion that this is the most important graph ever graphed.”

Steve Jurvetson

Moore’s Law

The expectation that your iPhone keeps getting thinner and faster every two years. Happy 50th anniversary.

Components get cheaper, computers get smaller, a lot of comparison tweets.

In 1965, Intel co-founder Gordon Moore made his original observation: over the history of computing hardware, the number of transistors in a dense integrated circuit doubles approximately every two years. The prediction was specific to semiconductors and stretched out for a decade. Its demise has long been predicted, and it will eventually come to an end, but it continues to hold to this day.
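The observation reduces to a simple doubling rule, sketched below (the 1971 Intel 4004 baseline of ~2,300 transistors is illustrative, not taken from this article):

```python
# Moore's Law as an idealized doubling rule: transistor counts double
# every two years. Baseline: ~2,300 transistors on the Intel 4004 (1971),
# used here only as an illustrative starting point.

def transistors(year, base_year=1971, base_count=2300, period=2):
    """Projected transistor count under a clean two-year doubling."""
    doublings = (year - base_year) / period
    return base_count * 2 ** doublings

# Ten doublings over twenty years: a 1,024-fold increase.
assert transistors(1991) == 2300 * 2 ** 10
```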

Expanding beyond semiconductors, and reshaping all kinds of businesses, including those not traditionally thought of as tech.

Yes, Box co-founder Aaron Levie is the official spokesperson for Moore’s Law, and we’re all perfectly okay with that. His cloud computing company would not be around without it. He’s grateful. We’re all grateful. In conversations Moore’s Law constantly gets referenced.

It has become both a prediction and an abstraction.

Expanding far beyond its origin as a transistor-centric metric.

But Moore’s Law of integrated circuits is only the most recent paradigm in a much longer and even more profound technological trend.

Humanity’s capacity to compute has been compounding for as long as we could measure it.

5 Computing Paradigms: Electromechanical computer built by IBM for the 1890 U.S. Census → Alan Turing’s relay-based computer that cracked the Nazi Enigma → Vacuum-tube computer that predicted Eisenhower’s win in 1952 → Transistor-based machines used in the first space launches → Integrated-circuit-based personal computer

The Law of Accelerating Returns

In his 1999 book The Age of Spiritual Machines Google’s Director of Engineering, futurist, and author Ray Kurzweil proposed “The Law of Accelerating Returns”, according to which the rate of change in a wide variety of evolutionary systems tends to increase exponentially. A specific paradigm, a method or approach to solving a problem (e.g., shrinking transistors on an integrated circuit as an approach to making more powerful computers) provides exponential growth until the paradigm exhausts its potential. When this happens, a paradigm shift, a fundamental change in the technological approach occurs, enabling the exponential growth to continue.

Kurzweil explains:

It is important to note that Moore’s Law of Integrated Circuits was not the first, but the fifth paradigm to provide accelerating price-performance. Computing devices have been consistently multiplying in power (per unit of time) from the mechanical calculating devices used in the 1890 U.S. Census, to Turing’s relay-based machine that cracked the Nazi enigma code, to the vacuum tube computer that predicted Eisenhower’s win in 1952, to the transistor-based machines used in the first space launches, to the integrated-circuit-based personal computer.

This graph, which venture capitalist Steve Jurvetson describes as the most important concept ever to be graphed, is Kurzweil’s 110-year version of Moore’s Law. It spans five paradigm shifts that have contributed to the exponential growth in computing.

Each dot represents the best computational price-performance device of the day, and when plotted on a logarithmic scale, they fit on the same double exponential curve that spans over a century. This is a very long-lasting and predictable trend. It enables us to plan for a time beyond Moore’s Law, without knowing the specifics of the paradigm shift that’s ahead. The next paradigm will advance our ability to compute on such a massive scale that it will be beyond our current ability to comprehend.

The Power of Exponential Growth

Human perception is linear; technological progress is exponential. Our brains are hardwired to have linear expectations because that has always been the case. Technology today progresses so fast that the past no longer looks like the present, and the present is nowhere near the future ahead. Then, seemingly out of nowhere, we find ourselves in a reality quite different from what we would expect.

Kurzweil uses the overall growth of the internet as an example. The bottom chart is linear, which makes the internet’s growth seem sudden and unexpected, whereas the top chart, with the same data graphed on a logarithmic scale, tells a very predictable story. On the exponential graph internet growth doesn’t come out of nowhere; it’s just presented in a way that is more intuitive for us to comprehend.
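The point about scales can be shown in a few lines of Python (an illustration, not Kurzweil's data): the same doubling series looks explosive on a linear axis but advances in identical steps in log space.

```python
import math

# A quantity that doubles every step, like internet-growth data.
data = [2 ** n for n in range(12)]

# On a linear axis the increments explode toward the end...
linear_steps = [b - a for a, b in zip(data, data[1:])]

# ...while on a logarithmic axis every step has exactly the same size,
# which is why exponential growth looks "predictable" in log space.
log_steps = [math.log2(b) - math.log2(a) for a, b in zip(data, data[1:])]

assert set(log_steps) == {1.0}                     # constant slope in log space
assert linear_steps[-1] == 1024 * linear_steps[0]  # runaway on the linear axis
```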

We are still prone to underestimate the progress that is coming, because it’s difficult to internalize the reality that we’re living in a world of exponential technological change. That is a fairly recent development. And it’s important to get a sense of the massive scale of advancement that the technologies of the future will enable, particularly now that we’ve reached what Kurzweil calls the “Second Half of the Chessboard.”

(The reference is the old legend of a chess inventor who asked his emperor for one grain of rice on the first square of the board, doubled on each square thereafter. In the end the emperor realizes that he’s been tricked, by exponents, and has the inventor beheaded. In another version of the story the inventor becomes the new emperor.)

It’s important to note that as the emperor and inventor went through the first half of the chessboard things were fairly uneventful. The inventor was first given spoonfuls of rice, then bowls of rice, then barrels, and by the end of the first half of the chess board the inventor had accumulated one large field’s worth — 4 billion grains — which is when the emperor started to take notice. It was only as they progressed through the second half of the chessboard that the situation quickly deteriorated.

# of Grains on 1st half: 4,294,967,295

# of Grains on 2nd half: 18,446,744,069,414,584,320
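Both totals fall out of summing powers of two, as a quick check confirms (the exact second-half figure is 18,446,744,069,414,584,320):

```python
# Rice-on-the-chessboard legend: one grain on the first square,
# doubling on each of the 64 squares.
first_half = sum(2 ** n for n in range(32))       # squares 1-32
second_half = sum(2 ** n for n in range(32, 64))  # squares 33-64

assert first_half == 4_294_967_295                 # ~4 billion grains
assert second_half == 18_446_744_069_414_584_320   # ~18 quintillion
assert first_half + second_half == 2 ** 64 - 1     # the whole board
```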

Mind-bending nonlinear gains in computing are about to get a lot more realistic in our lifetime, as there have been slightly more than 32 doublings of performance since the first programmable computers were invented.

Kurzweil’s Predictions

Kurzweil is known for making mind-boggling predictions about the future. And his track record is pretty good.

“…Ray is the best person I know at predicting the future of artificial intelligence.” —Bill Gates

Ray’s predictions for the future may sound crazy (they do sound crazy), but it’s important to note that it’s not about the specific prediction or the exact year. What matters is what they represent. These predictions are based on an understanding of Moore’s Law and Ray’s Law of Accelerating Returns, an awareness of the power of exponential growth, and an appreciation that information technology follows an exponential trend. They may sound crazy, but they are not pulled out of thin air.

And with that being said…

Second Half of the Chessboard Predictions

“By the 2020s, most diseases will go away as nanobots become smarter than current medical technology. Normal human eating can be replaced by nanosystems. The Turing test begins to be passable. Self-driving cars begin to take over the roads, and people won’t be allowed to drive on highways.”

“By the 2030s, virtual reality will begin to feel 100% real. We will be able to upload our mind/consciousness by the end of the decade.”

To expand image → https://twitter.com/nivo0o0/status/564309273480409088

Not quite there yet…

“By the 2040s, non-biological intelligence will be a billion times more capable than biological intelligence (a.k.a. us). Nanotech foglets will be able to make food out of thin air and create any object in the physical world at a whim.”

These clones are cute.

“By 2045, we will multiply our intelligence a billionfold by linking wirelessly from our neocortex to a synthetic neocortex in the cloud.”

Multiplying our intelligence a billionfold by linking our neocortex to a synthetic neocortex in the cloud — what does that actually mean?

In March 2014 Kurzweil gave an excellent talk at the TED Conference. It was appropriately called: Get ready for hybrid thinking.

Here is a summary:

To expand image → https://twitter.com/nivo0o0/status/568686671983570944

These are the highlights:

Nanobots will connect our neocortex to a synthetic neocortex in the cloud, providing an extension of our neocortex.

Our thinking will then be a hybrid of biological and non-biological thinking (the non-biological portion is subject to the Law of Accelerating Returns and will grow exponentially).

The frontal cortex and neocortex are not really qualitatively different, so it’s a quantitative expansion of the neocortex (like adding processing power).

The last time we expanded our neocortex was about two million years ago. That additional quantity of thinking was the enabling factor for us to take a qualitative leap and advance language, science, art, technology, etc.

We’re going to expand our neocortex again, only this time it won’t be limited by a fixed architecture of enclosure. It will be expanded without limits, by connecting our brain directly to the cloud.

We already carry a supercomputer in our pocket. We have unlimited access to all the world’s knowledge at our fingertips. Keeping in mind that we are prone to underestimate technological advancements (and that 2045 is not a hard deadline), is it really that much of a stretch to imagine a future where we’re always connected directly from our brain?

Progress is underway. We’ll be able to reverse-engineer the neural cortex within five years. Kurzweil predicts that by 2030 we’ll be able to reverse-engineer the entire brain. His latest book is called How to Create a Mind… This is the reason Google hired Kurzweil.

Hybrid Human Machines


“We’re going to become increasingly non-biological…”

“We’ll also have non-biological bodies…”

“If the biological part went away it wouldn’t make any difference…”

“They will be as realistic as real reality.”

Impact on Society

The technological singularity, “the hypothesis that accelerating progress in technologies will cause a runaway effect wherein artificial intelligence will exceed human intellectual capacity and control, thus radically changing civilization,” is beyond the scope of this article, but these advancements will absolutely have an impact on society. Which way is yet to be determined.

There may be some regret

Politicians will not know who/what to regulate.

Evolution may take an unexpected twist.

The rich-poor gap will expand.

The unimaginable will become reality and society will change.

Weather forecasting in the Southeast is a headache for scientists (Folha de S.Paulo)

15/02/2015

Peculiarities of the regional climate make it hard to know what the Cantareira’s level will be even in the short term

The area is exposed to several complex influences, such as moisture from the Amazon and cold fronts from Antarctica

REINALDO JOSÉ LOPES

REPORTING FOR FOLHA

LUCAS VETTORAZZO

FROM RIO

If the succession of good and bad news about the rain that feeds São Paulo’s reservoirs seems like one big muddle, don’t worry: climate forecasts for Brazil’s Southeast can confuse even specialists.

That is because the most populous region of Brazil occupies an area of the globe that receives every kind of complex influence, from moisture coming off the Amazon to cold fronts “blown in” from Antarctica.

The result: an above-normal level of uncertainty in a field that is, by nature, already quite uncertain.

“That applies mainly to predicting the climate, that is, medium- and long-term variations, but it is also true, though to a much lesser degree, of weather forecasts, on the scale of days,” says Tercio Ambrizzi, a climatologist at USP.

So it is not that the weather is more unstable in the area of the Cantareira system, the hardest hit by the current crisis and now in slight recovery. What happens is that the region feeding the Cantareira can at times be more exposed to random swings of a naturally complicated climate system.

CHAOS THEORY

“On scales longer than 15 days, it has been established for decades that the climate is chaotic,” says Gilvan Sampaio de Oliveira, a meteorologist at Inpe (Brazil’s National Institute for Space Research).

“In fact, that is where chaos theory came from,” he says, referring to the idea that, in certain complex phenomena, small changes at the start can lead to much larger and unpredictable changes at the end.

In tropical regions, which cover almost all of Brazil’s territory, this is even more true, because heat injects more energy into the atmosphere, making changes in the weather happen faster and less predictably.
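The sensitivity the meteorologists describe can be illustrated with the logistic map, a standard toy model of chaos (an illustrative example, not taken from the article):

```python
# Sensitive dependence on initial conditions: two almost-identical
# starting states diverge completely under the logistic map
# x -> r*x*(1-x) with r in the chaotic regime.

def max_gap(x0, y0, steps=100, r=3.9):
    """Largest separation between two orbits over `steps` iterations."""
    x, y = x0, y0
    gap = 0.0
    for _ in range(steps):
        x = r * x * (1 - x)
        y = r * y * (1 - y)
        gap = max(gap, abs(x - y))
    return gap

# A difference of one part in a million blows up to order one,
# which is why long-range forecasts lose skill so quickly.
assert max_gap(0.5, 0.500001) > 0.5
assert max_gap(0.5, 0.5) == 0.0   # identical starts never diverge
```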

Beyond the heat, though, the Southeast has the added disadvantage that its climate variations depend on non-oceanic factors.

“When a region’s climate depends on the ocean, it is much easier to predict, because the ocean varies far more slowly than the atmosphere,” Oliveira explains. “That is the case of the semi-arid Northeast, tied basically to conditions in the Pacific and the tropical Atlantic. In an El Niño year, with a warmer Pacific, the tendency is drought in the Northeast.”

The Southeast’s rains, especially in summer, are instead linked mainly to the ZCAS (South Atlantic Convergence Zone), formed by moisture from the Amazon, which spreads along a broad band across Central Brazil, and by Antarctic cold fronts (see infographic).

“When this zone strengthens you can get steady rain for three, four, five days, and it is quite common for that to happen during Carnival, as indeed it should this year,” says Oliveira.

In 2014, however, and to a lesser degree this year as well, the ZCAS did not behave as it should, with an atmospheric blocking pattern keeping the summer rains from hitting the Southeast (and the Cantareira) head-on. Steady, well-distributed rains returned only in recent weeks, because the ZCAS seems to have “settled” back into place.

Even in that scenario, the summer rains do not stop entirely. With the season’s typical heat there is a rapid cycle of evaporation and rain, but it is a local pattern, which explains localized storms and floods in Greater São Paulo even as those downpours barely tickle the Cantareira.

There is a further complication, which may help explain the area’s reputation for unpredictability. Until recently there were no reliable rain gauges to measure rainfall in the Cantareira region, says José Marengo, a climatologist at Cemaden (the National Center for Natural Disaster Monitoring and Alerts).

“The nearest rain gauges were in Campos do Jordão. Historical records are lacking. We cannot interpolate with the Campos do Jordão data because it is a different rainfall regime.”