Author Topic: Pluto in Cap, the climate, ecology and environment topic
« Reply #1170 on: Aug 21, 2014, 05:14 AM »

39 kilotons a year: Mysterious source of ozone-depleting chemical banned since 2009 baffles NASA

By Agence France-Presse
Thursday, August 21, 2014 5:55 EDT

A chemical used in dry cleaning and fire extinguishers may have been phased out in recent years but NASA said Wednesday that carbon tetrachloride (CCl4) is still being spewed into the atmosphere from an unknown source.

The world agreed to stop using CCl4 as part of the Vienna Convention for the Protection of the Ozone Layer and its Montreal Protocol, which attained universal ratification in 2009.

“Parties to the Montreal Protocol reported zero new CCl4 emissions between 2007-2012,” the US space agency said in a statement.

“However, the new research shows worldwide emissions of CCl4 average 39 kilotons per year, approximately 30 percent of peak emissions prior to the international treaty going into effect.”

CCl4 levels are not high enough to reverse the decreasing trend in ozone depletion, but experts are still mystified as to where the chemical is coming from.

With no new reported emissions, atmospheric concentrations of the compound should have declined at an expected rate of four percent per year since 2007.

However, ground-based observations showed atmospheric concentrations declining by only about one percent per year.
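
The arithmetic behind that mismatch is simple exponential decline. The sketch below is a toy calculation rather than anything from NASA's GEOS model: it compounds the two quoted rates over five years from an arbitrary starting index to show how far apart the expected and observed curves drift.

```python
# Toy comparison of the expected (~4%/yr) vs observed (~1%/yr) CCl4 decline.
# The starting value is an arbitrary index, not a measured concentration.
def project(concentration, annual_decline_fraction, years):
    for _ in range(years):
        concentration *= (1.0 - annual_decline_fraction)
    return concentration

start = 100.0                         # index value at 2007
expected = project(start, 0.04, 5)    # ~81.5 by 2012 if new emissions were truly zero
observed = project(start, 0.01, 5)    # ~95.1, roughly what ground stations measured
print(round(expected, 1), round(observed, 1))
```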

“We are not supposed to be seeing this at all,” said Qing Liang, an atmospheric scientist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland.

“It is now apparent there are either unidentified industrial leakages, large emissions from contaminated sites, or unknown CCl4 sources.”

Researchers used NASA’s 3-D GEOS Chemistry Climate Model and data from global networks of ground-based observations to establish the first estimate of average global CCl4 emissions from 2000 to 2012.

In going through the data, researchers also learned that the chemical stays in the atmosphere about 40 percent longer than previously thought.

“People believe the emissions of ozone-depleting substances have stopped because of the Montreal Protocol,” said Paul Newman, chief scientist for atmospheres at NASA.

“Unfortunately, there is still a major source of CCl4 out in the world.”

The study was published in the journal Geophysical Research Letters.



« Reply #1171 on: Aug 21, 2014, 07:19 AM »

Ancient scourge? Myanmar still sees 3,000 new leprosy cases a year

By Agence France-Presse
Thursday, August 21, 2014 5:22 EDT

High in the hills of Myanmar’s war-torn borderlands, a clutch of new leprosy cases among communities virtually cut off from medical help is a sign that the country’s battle with the ancient disease is far from over.

It took six days by plane, boat, motorcycle, bus — and an arduous mountain trek — for a group of medical workers to treat two leprosy patients in a remote corner of the country, where conflict and neglect are the legacy of decades of military rule and even access to basic medicines is a distant dream.

But the charity-funded medics were also on the lookout for evidence that the disease had spread.

They soon found three more leprosy sufferers, including one man who had such a severe case he required hospital care.

“I promised him that I would come back for him or I would send someone to pick him up,” said Doctor Saw Hsar Mu Lar, after the May expedition, as he returned to his hospital in Mawlamyaing, Mon state — one of only two specialising in leprosy in Myanmar.

Weeks later the patient was still waiting to travel as tensions between the Myanmar army and local rebels closed transportation routes.

Myanmar reached so-called ‘elimination’ status for leprosy in 2003 — meaning fewer than one person per 10,000 has the illness.

But there are still around 3,000 new cases found each year and medical workers warn that the debilitating disease could be on the rise once more as the country’s creaking healthcare system fails to reach those at risk.

Decades of civil war in ethnic regions have also left vast swathes of its border areas cut off from all but the most basic medical help, meaning the disease could be passing undetected.

“There can be pocket areas, hidden areas,” Saw Hsar Mu Lar told AFP.

“We have to tell the world that it’s not finished yet.”

- A curable curse -

Leprosy is one of the world’s oldest — and most feared — diseases.

The bacterium affects the skin and deadens the nerves, meaning sufferers are prone to injure themselves, which results in ulcers and can lead to limb loss. Symptoms can take as long as 20 years to appear.

It is not particularly infectious, passing only through close contact over long periods, and modern medicine is able to cure patients relatively quickly.

But Myanmar has one of the world’s least developed medical systems, with government funding consistently among the lowest of any country, even with recent increases under a post-junta semi-civilian government.

State health workers are technically in charge of outreach and aid groups are banned from conducting leprosy awareness campaigns or looking for new patients — although they can treat people they find through dermatology clinics and during follow-up field trips.

The respected local aid group that organised the border expedition asked AFP not to give specific details of their work fearing that it could jeopardise future missions.

Saw Hsar Mu Lar’s Mawlamyaing Christian Leprosy Hospital, with its bright, simple wards, trained staff and plentiful supply of drugs, is a medical haven — funded mainly by international donations.

Most of the patients AFP met were farmers or had turned to begging to make ends meet.

“We had no medicine at our village even though we had a clinic,” said 40-year-old Mu Hai, who had travelled from western Rakhine state for treatment.

The hospital’s matron, Ni Ni Thein, is worried. In 2011 they saw 58 new leprosy cases, but that rose to 62 in 2012 and 68 last year.

“Now cases are increasing… the complication rate is increasing,” she said, adding that the age range of patients also appeared to have widened, with one four-year-old treated this year.

The fight to stop leprosy has been a major international success, with around 16 million people cured by multi-drug therapy (MDT) medicine in the last two decades.

But experts warn against complacency.

Myanmar is one of 18 countries that together account for almost all new cases of the disease.

The number of new cases it finds annually is dwarfed by its populous neighbour India, where there were some 127,000 new patients identified in 2011 according to World Health Organisation figures.

But while India achieved a reduction of more than 50 percent between 2004 and 2011, Myanmar managed to cut its new cases by only 18 percent.

The WHO’s goodwill ambassador on leprosy, Yohei Sasakawa, said stagnation in Myanmar’s new case numbers over several years could indicate authorities are not doing enough to root out the disease.

One problem is that the numbers affected seem small compared to other health challenges like HIV, tuberculosis and malaria.

“It is quite easy to be brought down the priority list,” he told AFP during a recent mission to the country.

- ‘He shall dwell alone’ -

Even if patients are cured, many around the world still fall victim to the stigma that clings to the disease, ending up living in segregated colonies.

Public vilification dates back over two thousand years.

The Bible says of leprosy sufferers: “he is unclean: he shall dwell alone”.

Saw Roger was chased out of his village when he started to show signs of leprosy aged 18 in the 1950s.

“I lived only with the animals in the jungle and I was frightened. I used to go into my village under the moonlight and I took rice and fish paste before going back into the dark forest,” the 76-year-old told AFP.

After two years sleeping in the woods, Roger was found by missionaries and taken to the Mawlamyaing hospital.

Roger, whose legs, left hand and eye have been ravaged by the disease, has found sanctuary there ever since.

Passing the time reading and leading the church choir, he said he has found happiness despite a lifetime of travails caused by the illness.

“I can continue to look forward,” he added.




« Reply #1172 on: Aug 21, 2014, 07:22 AM »

New study highlights precarious state of the world’s primary forests

By RedOrbit
August 20, 2014

An estimated 95 percent of the primary forests that existed prior to the advent of agriculture have been lost in non-protected areas, according to new research published online Thursday in the Society for Conservation Biology journal Conservation Letters.

The paper, which was prepared by an international team of experts in forest ecology, conservation biology, international policy and practical forest conservation issues, details what the authors are calling a global analysis of the ecosystem also known as old-growth forests and also features a map illustrating their findings.

Lead researcher Professor Brendan Mackey, Director of the Climate Change Response Program at Griffith University in Queensland, Australia, and colleagues from organizations such as the US Wildlife Conservation Society, the Zoological Society of London, the Geos Institute and Australian National University conclude that primary forest protection is a global concern and should be the responsibility of both developed and developing countries.

In a statement, the Wildlife Conservation Society said that old-growth forests, which are “forests where there are no visible indications of human activities, especially industrial-scale land use, and ecological processes have not been significantly disrupted,” have been “largely ignored by policy makers and under increasing land use threats.”

The organization added that these forests “are home to an extraordinary richness of biodiversity, with up to 57 percent of all tropical forest species dependent on primary forest habitat and the ecological processes they provide.”

Their analysis has determined that nearly 98 percent of all primary forests can be found in 25 countries, and that roughly half of that figure is located in just five developed nations: Australia, Canada, New Zealand, Russia and the US.

Professor Mackey cautions that human activities such as industrial logging, mining and agriculture pose a grave threat to these forest lands, especially those located outside of protected areas. He also said that new policies were urgently needed in order to reduce the pressure to make primary forests available for industrial land use.

“International negotiations are failing to halt the loss of the world's most important primary forests,” he explained. “In the absence of specific policies for primary forest protection in biodiversity and climate change treaties, their unique biodiversity values and ecosystem services will continue to be lost in both developed and developing countries.”

“Primary forests are a matter of significant conservation concern. Most forest-endemic biodiversity needs primary forest for their long-term persistence and large intact forest landscapes are under increasing pressure from incompatible land use,” added co-author James Watson of the Wildlife Conservation Society.

Mackey, Watson and their colleagues devised four new actions that they believe could serve as a foundation for new international forest-protection policies, starting with the recognition of primary forests as a matter of global concern and not just an issue in developing countries.

They are also calling for these forests to be incorporated into environmental accounting, with their ecosystem services, such as freshwater and watershed services, acknowledged, and for a science-based definition to be used to distinguish primary forests. In addition, they are calling for policies seeking to avoid further biodiversity loss and emissions from primary forest deforestation and degradation to become a priority.

Finally, they are calling for the universal acceptance of the important role that indigenous and community conserved areas play in the protection of these forests, calling on governments to use this issue as “a mechanism within multilateral environmental agreements to support sustainable livelihoods for the extensive populations of forest-dwelling peoples, especially traditional peoples, in developed and developing countries.”




« Reply #1173 on: Aug 22, 2014, 06:17 AM »


Global warming slowdown answer lies in depths of Atlantic, study finds

Excess heat being stored hundreds of metres down in Atlantic and Southern oceans – not Pacific as previously thought

Adam Vaughan   
The Guardian, Thursday 21 August 2014 21.35 BST      

The key to the slowdown in global warming in recent years could lie in the depths of the Atlantic and Southern Oceans where excess heat is being stored – not the Pacific Ocean as has previously been suggested, according to new research.

But the finding suggests that a naturally occurring ocean cycle burying the heat will flip in around 15 years’ time, causing global temperature rises to accelerate again.

The slowdown of average surface temperature rises in the last 15 years after decades of rapid warming has been seized on by climate change sceptics and has puzzled scientists, who have hypothesised that everything from volcanic eruptions and sulphur from Chinese power stations to heat being trapped deep in the oceans could be the cause. Several studies have focused on the Pacific as potentially playing a major role.

The new study, published in the journal Science on Thursday, concludes that the Pacific alone cannot explain the warming “hiatus” and that much of the heat being trapped by greenhouse gases at record levels in the atmosphere is being sunk hundreds of metres down in the Atlantic and Southern Oceans.

Ka-Kit Tung, author of the paper and University of Washington professor, said: “The finding is a surprise, since the current theories had pointed to the Pacific Ocean as the culprit for hiding heat. But the data are quite convincing and they show otherwise.”

“We are not downplaying the role of the Pacific. They are both going on [the oceans having an effect on temperatures]; one is short term [the Pacific], one is long term [the Atlantic],” he told the Guardian.

A shift in the salinity of the north Atlantic triggered the effect around the turn of the century, the study says, as surface water there became saltier and more dense, sinking and taking surface heat down to depths of more than 300 metres.

Using temperature data from floats across the world, Tung found the Atlantic and Southern Oceans “each account for just under half the global energy storage change since 1999 at below 300m”. The study’s result, he says, does not support the “Pacific-centric” view of earlier work on whether heat is being stored.

“We were surprised to see the evidence presented so clearly. When you go with the energy, you cannot argue with that,” said Tung.

Jon Robson, a climate scientist at the University of Reading who is not connected to the study, said the new work did not disprove evidence of the Pacific’s role in the warming slowdown.

“The hiatus really is a patchwork problem of lots of different things, volcanoes, the Pacific, the Atlantic. This paper does elevate the Atlantic’s role, which has been largely ignored before. This does suggest a role for the Atlantic but there’s a lot more to it than that,” he told the Guardian.

“It doesn’t dispel the key role for the Pacific in the hiatus. There is evidence that the hiatus is a northern hemisphere winter phenomenon, which does point the finger quite strongly to the Pacific.”

Piers Forster, professor of climate change at the University of Leeds, said: “This paper suggests that heat disappearing into the depths of the Atlantic and Southern Oceans are the dominant cause. Their ideas seem fine but I’m also convinced there is more going on: the El Niño and relative cooler European and Asian winters remain important aspects to understand.”

The study, Varying Planetary Heat Sink Led to Global-Warming Slowdown and Acceleration, gives little room for complacency that the oceans can safely store heat caused by human activities because the cycle that buries the heat deep in the Atlantic will “inevitably” switch back. Heat would then no longer be removed deep underwater, leading to “another episode of accelerated warming” at the surface.

Forster added: “Most importantly, this paper is another nail in the coffin of the idea that the hiatus is evidence that our projections of long-term climate change need revising down. Variability in the ocean will not affect long-term climate trends but may mean we have a period of accelerated warming to look forward to.”

This February, the national science academies of the US and UK said the global warming slowdown did not “invalidate” the long-term trend of rising temperatures caused by man-made climate change.




« Reply #1174 on: Aug 25, 2014, 06:21 AM »

Scientists alarmed at ‘incredible’ rate of ice sheet depletion

By Robin McKie, The Observer
Sunday, August 24, 2014 21:04 EDT

The planet’s two largest ice sheets – in Greenland and Antarctica – are now being depleted at an astonishing rate of 120 cubic miles each year. That is the discovery made by scientists using data from CryoSat-2, the European probe that has been measuring the thickness of Earth’s ice sheets and glaciers since it was launched by the European Space Agency in 2010.

Even more alarming, the rate of loss of ice from the two regions has more than doubled since 2009, revealing the dramatic impact that climate change is beginning to have on our world.

The researchers, based at Germany’s Alfred Wegener Institute Helmholtz Centre for Polar and Marine Research, used 200m data points across Antarctica and 14.3m across Greenland, all collected by CryoSat, to study how the ice sheets there had changed over the past three years. The satellite carries a high-precision altimeter, which sends out short radar pulses that bounce off the ice surface and then back to the satellite. By measuring the time this takes, the height of the ice beneath the spacecraft can be calculated.
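
In essence the altimeter converts a round-trip travel time into a range, and the range into a surface height. The snippet below is a deliberately simplified sketch of that conversion; the orbit altitude and pulse timing are illustrative placeholders, and the real CryoSat processing chain adds precise orbit determination and many corrections.

```python
# Simplified radar-altimetry range equation: distance = (speed of light * time) / 2.
# Orbit altitude and round-trip time below are illustrative values only.
C = 299_792_458.0  # speed of light, m/s

def surface_elevation_m(orbit_altitude_m, round_trip_time_s):
    range_to_surface = C * round_trip_time_s / 2.0    # one-way distance to the ice
    return orbit_altitude_m - range_to_surface        # height of the ice surface

# CryoSat-2 flies at roughly 720 km; a pulse returning after ~4.8 ms
# implies an ice surface a few hundred metres above the reference level.
print(round(surface_elevation_m(720_000.0, 0.0048), 1))
```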

From the average drops in elevation detected by CryoSat, the researchers found that Greenland alone is losing about 90 cubic miles a year, while Antarctica’s annual volume loss is about 30 cubic miles. These rates of loss – described as “incredible” by one researcher – are the highest observed since altimetry satellite records began about 20 years ago, and they mean that the ice sheets’ annual contribution to sea-level rise has doubled since 2009, say the researchers, whose work was published in the journal The Cryosphere last week.
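
For a sense of what 120 cubic miles a year means for the oceans, the rough conversion below turns that ice volume into millimetres of global sea-level rise. The density and ocean-area values are standard textbook figures, and the result is only a back-of-envelope estimate, not a number from the paper.

```python
# Back-of-envelope conversion: cubic miles of ice lost per year -> mm of sea-level rise.
KM3_PER_CUBIC_MILE = 4.168     # 1 cubic mile in km^3
ICE_TO_WATER = 0.917           # ice density relative to fresh water
OCEAN_AREA_KM2 = 3.61e8        # approximate global ocean surface area

ice_loss_cubic_miles = 120
water_km3 = ice_loss_cubic_miles * KM3_PER_CUBIC_MILE * ICE_TO_WATER
rise_mm_per_year = water_km3 / OCEAN_AREA_KM2 * 1e6    # km converted to mm
print(round(rise_mm_per_year, 2))                      # ~1.3 mm per year
```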

“We have found that, since 2009, the volume loss in Greenland has increased by a factor of about two, and the West Antarctic ice sheet by a factor of three,” said glaciologist Angelika Humbert, one of the study’s authors. “Both the West Antarctic ice sheet and the Antarctic peninsula, in the far west, are rapidly losing volume. By contrast, East Antarctica is gaining volume, though at a moderate rate that doesn’t compensate for the losses on the other side of the continent.”

The researchers say they detected the biggest elevation changes caused by ice loss at the Jakobshavn glacier in Greenland, which was recently found to be shifting ice into the oceans faster than any other ice-sheet glacier, and at Pine Island glacier, which like other glaciers in West Antarctica, has been thinning rapidly in recent years.

The discovery of these losses of ice is particularly striking and represents yet another blow to claims by some climate-change deniers, who argue that the rapid loss of ice in the Arctic currently being observed is being matched by a corresponding increase in Antarctica. CryoSat’s measurements show that Antarctica – although considerably colder than the Arctic because of its much higher average elevation – is not gaining ice at all. Indeed, it is – overall – losing considerable volumes, and in the case of West Antarctica is doing so at an alarming rate.

This point was stressed by Mark Drinkwater, the European Space Agency’s CryoSat mission scientist. “These results offer a critical new perspective on the recent impact of climate change on large ice sheets. This is particularly evident in parts of the Antarctic peninsula, where some of the more remarkable features add testimony on the impact of sustained peninsula warming at rates several times the global average.”

guardian.co.uk © Guardian News and Media 2014



« Reply #1175 on: Aug 27, 2014, 06:00 AM »

U.N. Draft Report Lists Unchecked Emissions’ Risks

By JUSTIN GILLIS
AUG. 26, 2014
IHT

Runaway growth in the emission of greenhouse gases is swamping all political efforts to deal with the problem, raising the risk of “severe, pervasive and irreversible impacts” over the coming decades, according to a draft of a major new United Nations report.

Global warming is already cutting grain production by several percentage points, the report found, and that could grow much worse if emissions continue unchecked. Higher seas, devastating heat waves, torrential rain and other climate extremes are also being felt around the world as a result of human-produced emissions, the draft report said, and those problems are likely to intensify unless the gases are brought under control.

The world may already be nearing a temperature at which the loss of the vast ice sheet covering Greenland would become inevitable, the report said. The actual melting would then take centuries, but it would be unstoppable and could result in a sea level rise of 23 feet, with additional increases from other sources like melting Antarctic ice, potentially flooding the world’s major cities.

“Human influence has been detected in warming of the atmosphere and the ocean, in changes in the global water cycle, in reduction in snow and ice, and in global mean-sea-level rise; and it is extremely likely to have been the dominant cause of the observed warming since the mid-20th century,” the draft report said. “The risk of abrupt and irreversible change increases as the magnitude of the warming increases.”

The report was drafted by the Intergovernmental Panel on Climate Change, a body of scientists and other experts appointed by the United Nations that periodically reviews and summarizes climate research. It is not final and could change substantially before release.

The report, intended to summarize and restate a string of earlier reports about climate change released over the past year, is to be unveiled in early November, after an intensive editing session in Copenhagen. A late draft was sent to the world’s governments for review this week, and a copy of that version was obtained by The New York Times.

Using blunter, more forceful language than the reports that underpin it, the new draft highlights the urgency of the risks that are likely to be intensified by continued emissions of heat-trapping gases, primarily carbon dioxide released by the burning of fossil fuels like coal, oil and natural gas.

The report found that companies and governments had identified reserves of these fuels at least four times larger than could safely be burned if global warming is to be kept to a tolerable level.

That means if society wants to limit the risks to future generations, it must find the discipline to leave a vast majority of these valuable fuels in the ground, the report said.

It cited rising political efforts around the world on climate change, including efforts to limit emissions as well as to adapt to changes that have become inevitable. But the report found that these efforts were being overwhelmed by construction of facilities like new coal-burning power plants that will lock in high emissions for decades.

From 1970 to 2000, global emissions of greenhouse gases grew at 1.3 percent a year. But from 2000 to 2010, that rate jumped to 2.2 percent a year, the report found, and the pace seems to be accelerating further in this decade.
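
Those growth rates compound quickly. The short sketch below is only an illustration, not a calculation from the draft report: it applies each rate to an arbitrary emissions index for a decade to show how much faster the 2.2 percent path climbs.

```python
# Compounding two annual growth rates over a decade from an arbitrary index of 100.
def grow(start, annual_rate, years):
    return start * (1.0 + annual_rate) ** years

base = 100.0
print(round(grow(base, 0.013, 10), 1))  # ~113.8 after ten years at 1.3%/yr
print(round(grow(base, 0.022, 10), 1))  # ~124.3 after ten years at 2.2%/yr
```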

A major part of the jump was caused by industrialization in China, which now accounts for half the world’s coal use. Those emissions are being incurred in large part to produce goods for consumption in the West.

Emissions are now falling in nearly all Western countries because of an increased focus on efficiency and the spread of lower-emitting sources of electricity. But the declines are not yet sufficient to offset rising emissions in developing countries, many of whose governments are focused on pulling their people out of poverty.

The new report found that it was still technically possible to limit global warming to an internationally agreed upper bound of 3.6 degrees Fahrenheit, or 2 degrees Celsius, above the preindustrial level. But continued political delays for another decade or two will make that unachievable without severe economic disruption, the report said.

The draft report comes a month before a summit meeting of world leaders in New York that is meant to set the stage for a potential global agreement on emissions that would be completed next year. However, concern is growing among climate experts that the leaders may not offer ambitious commitments in their speeches on Sept. 23, a potential continuation of the political inaction that has marked the climate issue for decades.

The draft report did find that efforts to counter climate change are gathering force at the regional and local level in many countries. This is especially clear in the United States, where Congress is paralyzed and the national government has effectively ceded leadership on climate to states like California, Massachusetts and New York.

President Obama, using his executive authority under the Clean Air Act, is seeking to impose national limits on emissions of greenhouse gases, but he faces profound legal and political challenges as he seeks to put his policy into effect before leaving office in early 2017.

The draft report found that past emissions, and the failure to heed scientific warnings about the risks, have made large-scale climatic shifts inevitable. But lowering emissions would still slow the expected pace of change, the report said, providing critical decades for human society and the natural world to adapt.

“Continued emission of greenhouse gases will cause further warming and long-lasting changes in all components of the climate system, increasing the likelihood of severe, pervasive and irreversible impacts for people and ecosystems,” the report said.

The earth has so far warmed by about 1.5 degrees Fahrenheit above the level that prevailed before the Industrial Revolution, the report found, and that seemingly modest increase is causing the effects already being seen around the world. A continued rapid growth of emissions in coming decades could conceivably lead to a global warming exceeding 8 degrees Fahrenheit, the report found. The warming would be higher over land areas, and higher still at the poles.

Warming that substantial would almost certainly have catastrophic effects, including a mass extinction of plants and animals, huge shortfalls in food production, extreme coastal flooding and many other problems, the report found.

The report noted that severe weather events, some of them linked to human-produced emissions, had disrupted the food supply in recent years, leading to several spikes in the prices of staple grains and destabilizing some governments in poorer countries.

Continued warming, the report found, is likely to “slow down economic growth, make poverty reduction more difficult, further erode food security, and prolong existing poverty traps and create new ones, the latter particularly in urban areas and emerging hot spots of hunger.”




« Reply #1176 on: Aug 27, 2014, 06:44 AM »


New coal power stations threat to EU’s emissions target

Burning Europe’s lignite reserves would wipe out carbon budget from 2020 until the end of the century, says Greenpeace

Karl Mathiesen   
theguardian.com, Wednesday 27 August 2014 08.00 BST      

New coal power stations designed to burn Europe’s massive deposits of lignite pose a serious threat to the continent’s decarbonisation efforts, according to figures released on Wednesday.

Analysts from Greenpeace’s Energydesk compiled data from the German government that shows burning Europe’s reserves of lignite would wipe out the EU’s entire carbon budget from 2020 until the end of the century.

Lignite – also known as brown coal – power stations currently make up more than 10% of the EU’s total CO2 emissions. Greenpeace said that if Europe is to continue to play its part in keeping the world within the internationally accepted limit of 2C of warming, 90% of the carbon contained in its lignite reserves must remain buried.

Despite this, lignite-fuelled power stations are still being built, locking in consumption of the fuel for decades. There are 19 such facilities in various stages of approval, planning or construction in Bulgaria, Czech Republic, Greece, Germany, Poland, Romania and Slovenia. Greenpeace figures show these new projects alone would emit almost 120m tonnes of CO2 every year – equivalent to three-quarters of the annual carbon output of the UK’s energy sector. The average lifespan for a coal power station is about 40 years, meaning the plants could release nearly 5bn tonnes of CO2 into the atmosphere.

Greenpeace energy analyst Jimmy Aldridge said: “The expansion of lignite mining in Europe is today the most serious symptom of the continent’s chronic addiction to dangerous fossil fuels, and a massive threat to its efforts to tackle climate change. The companies involved will continue for as long as they can – we need our political leaders to act in order to stop this situation from getting worse. [Barack] Obama has taken decisive action against coal in the US, it’s time European leaders did the same.”

But Max Grünig, an energy sector economist at the Ecologic Institute in Germany, said the continued burning of lignite did not represent an immediate threat to carbon targets.

“The total cap for the EU emissions trading scheme is decreasing and ensures that the energy sector is on track to meet the climate targets. If there are more coal power plants, they will have to compensate by buying emissions allowances from other sectors, but net emissions of CO2 cannot increase.”

Grünig said the strategic reliance on lignite in politically powerful Poland and Germany had made these countries resistant to more ambitious emissions targets. Poland was especially belligerent during the process of setting the EU’s 2030 target of a 40% cut in emissions.

“Poland is in the sad position to depend mostly (above 90%) on coal and has therefore very high costs associated with stricter emissions targets. Thus, they have to resist proposals for stricter targets in their own financial interest. It’s a trap,” he said.

On Saturday, 7,500 Greenpeace protesters created a human chain across the German-Polish border. The symbolic gesture linked two villages earmarked for destruction to make room for the planned expansion of an open cut lignite mine.

On the German side of the border in Lusatia, Sweden’s state energy company Vattenfall operates mines and power stations that emit as much CO2 as the whole of Sweden, according to Greenpeace figures. The company is looking to expand its mining operations at three of its existing sites. Greenpeace claims Vattenfall also has two entirely new mines in the planning process as well as a new power station. Vattenfall says there are no new mines being planned. But a spokesperson for Vattenfall said new power infrastructure was dependent on the approval of expanded mining operations.

“As long as the extension of Jänschwalde opencast mine will be agreed by the states government of Brandenburg, in a midterm there will be needed a newly built power plant in Jänschwalde.”

Lignite is an increasingly important part of the world’s energy supply. Consumption in Europe has remained stable since the late 1990s, but grew slightly over the past few years on the back of high gas prices and the scaling back of nuclear power in Germany. In Poland, lignite burning increased 3.7% in 2012, a year when power demand was actually falling. A recent analysis by Standard and Poor’s found that 50-60% of the world’s coal reserves were low-quality, high-emitting coals such as lignite.   




« Reply #1177 on: Aug 27, 2014, 06:46 AM »


UK lobbying to keep open one of Europe's dirtiest coal power stations

Government seeks exemption from European law that would close down Aberthaw for its excessive nitrogen oxide emissions

Karl Mathiesen   
theguardian.com
27 August 2014 10.42 BST

The UK government is lobbying the European commission (EC) to keep open one of Europe’s dirtiest coal power stations, even though its nitrogen oxide (NOx) emissions exceed new legal limits by five times.

The EC has begun infraction proceedings against the UK because its proposals for reducing emissions under new European laws have been littered with “inconsistent or missing” data.

Aberthaw power plant in south Wales was named in the top 30 highest carbon-emitting plants in Europe by an alliance of NGOs last month. But its emissions of NOx, which causes respiratory problems and lung disease, are also extremely high.

The plant is specifically designed to burn coal from the local area, which is unusually difficult to ignite and needs a chemical catalyst added to make it burn. This process results in NOx emissions of around 1,000 mg/Nm3. The limit on NOx set by the European industrial emissions directive (IED) is 200 mg/Nm3. Under this new law, the 1,555MW plant would have to be shut down by 2016.

The EC and UK disagree on whether the plant qualifies for an exemption from the limits. An EC spokesperson confirmed that the UK continues to lobby for the plant to stay open, adding that “for the time being, the closure of the plant is not considered”.

A spokesperson for the Welsh government, which is leading negotiations on Aberthaw, said it hoped to come to an “acceptable resolution” but would not specify what the government defined as acceptable. He confirmed the UK’s Department for Environment, Food and Rural Affairs (Defra) was also “heavily involved”.

Defra refused to comment, saying infraction cases are subject to a confidentiality agreement.

Greenpeace energy analyst Jimmy Aldridge said: “The fact that it’s [Defra] that is trying to keep one of the most polluting power stations in Europe open is beyond parody. If the government’s special pleading is successful then Aberthaw will continue to produce toxic emissions harming people’s health for years to come and seriously damage our efforts to tackle climate change.”

Aberthaw currently receives an exemption from the EC that allows it to emit up to 1,200 mg/Nm3 of NOx. This is based on previous UK pleas to allow the plant to continue because it was using low-volatility indigenous coal and supporting a major local industry.

The 43-year-old station’s operator, RWE npower, is now mixing local coal with other types that are easier to combust. Currently 30% of the coal on site is from overseas, mainly Russia. The EC exemption is based on the plant burning coal of less than 10% volatility. Now it uses coals of between 6% and 15%, and the UK has not provided precise data on the annual average volatility.

The EC says that it is therefore no longer eligible for the extra emissions.

The government disagrees with the EC and says Aberthaw should be treated as a low-volatility plant. Defra has submitted two Transitional National Plans (TNPs), which outline to the EC how emissions will be reduced, which include the maximum exemption of 1,200 mg/Nm3 for Aberthaw. It is unclear how long the UK is asking for the exemption to last. The EC has rejected both proposals and commenced formal infraction proceedings against the UK.

Unions are now worried that the conflict could leave no room for compromise and the plant could close in less than two years.

Aldridge says that if Aberthaw receives the full allowance of 1,200 mg/Nm3, it could water down emissions reductions not just for one plant but for the whole country. Under the UK TNP, plants that reduce their own emissions would be given tradeable credits that they can sell to other, more polluting plants.

In July, RWE npower announced plans to retrofit low-emission boilers to Aberthaw, with the potential to reduce NOx emissions by 60%. This could flood the UK market with meaningless emissions credits.

A spokesperson for rival power company Eon said special treatment should be avoided. “Our view is very much that each time special treatment is granted for a specific named plant, investments in emissions abatement that have been made in good faith, on the assumption that regulations will be implemented consistently, are undermined.”




« Reply #1178 on: Aug 27, 2014, 06:48 AM »


Big power out, solar in: UBS urges investors to join renewables revolution

World’s largest private bank predicts large-scale power stations will soon make way for electric cars and new solar technologies

John Vidal   
theguardian.com, Wednesday 27 August 2014 11.40 BST   
   
Big power stations in Europe could be redundant within 10-20 years as electric cars, cheaper batteries and new solar technologies transform the way electricity is generated, stored and distributed, say analysts at the world’s largest private bank.

In a briefing paper sent to clients and investors this week, the Zurich-based UBS bank argues that large-scale, centralised power stations will soon become extinct because they are too big and inflexible, and are “not relevant” for future electricity generation. Instead, the authors expect it to be cheaper and more efficient for households and businesses to generate their own energy to power their cars and to store any surplus energy in their own buildings even without subsidies.

In language more closely associated with green NGOs, the bank with assets of more than $1.5tn says it expects a paradigm shift away from large-scale conventional power plants. “Power is no longer something that is exclusively produced by huge, centralised units owned by large utilities. By 2025, everybody will be able to produce and store power. And it will be green and cost competitive, ie, not more expensive or even cheaper than buying power from utilities,” say the authors, who urge their financial clients to “join the revolution.”

“Solar is at the edge of being a competitive power generation technology. The biggest drawback has been its intermittency. This is where batteries and electric vehicles (EVs) come into play. Battery costs have declined rapidly, and we expect a further decline of more than 50% by 2020. By then, a mass [produced] electric vehicle will have almost the same price as a combustion engine car. But it will save up to €2,000 (£1,600) a year on fuel cost, hence, it will begin to pay off almost immediately without any meaningful upfront ‘investment’. This is why we expect a rapidly growing penetration with EVs, in particular in countries with high fossil fuel prices.”

The expected 50% reduction in the cost of batteries by 2020 will not just spur electric car sales, but could also lead to exponential growth in demand for stationary batteries to store excess power in buildings, says UBS. “Battery storage should become financially attractive for family homes when combined with a solar system and an electric vehicle. As a consequence, we expect transformational changes in the utility and auto sectors,” it says. “By 2020 investing in a home solar system with a 20-year life span, plus some small-scale home battery technology and an electric car, will pay for itself in six to eight years for the average consumer in Germany, Italy, Spain, and much of the rest of Europe.”

By 2025, falling battery and solar costs will make electric vehicles cheaper than conventional cars in most European markets. “As a conservative 2025 scenario, we think about 10% of new car registrations in Europe will be EVs. Households and businesses who invest in a combined electric car, solar array and battery storage should be able to pay the investment back within six to eight years,” UBS says. “In other words, based on a 20-year technical life of a solar system, a German buyer should receive 12 years of electricity for free.”
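
The payback logic behind those claims is straightforward: divide the upfront cost by the annual savings. The sketch below is a simplified illustration in the spirit of the UBS scenario; every cost and savings figure in it is an assumption made for the example (apart from the €2,000 annual fuel saving quoted above), not a number taken from the UBS briefing.

```python
# Simple payback-period illustration for a combined solar + battery + EV purchase.
# All figures are assumed for the example, except the EUR 2,000/yr fuel saving
# cited in the article.
def simple_payback_years(upfront_cost_eur, annual_savings_eur):
    return upfront_cost_eur / annual_savings_eur

solar_and_battery = 14_000   # assumed installed cost
ev_premium = 4_000           # assumed extra cost of an EV over a combustion car
fuel_savings = 2_000         # per year, the figure UBS cites
electricity_savings = 800    # assumed per year from self-consumed solar power

years = simple_payback_years(solar_and_battery + ev_premium,
                             fuel_savings + electricity_savings)
print(round(years, 1))       # ~6.4 years under these assumptions
```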

But the bank does not expect power companies or the grid to disappear: UBS says they have a future if they develop smart grids which manage electricity demand more efficiently and provide decentralised back-up power generation.

“Electric vehicles are the key catalyst for driving mass adoption of battery storage technologies, as autos will fast-track mass production, which will be significant in driving down costs. We see battery costs moving down from $360/kWh today to $200/kWh by 2020, and as low as $100/kWh within 10 years. We believe that by 2020, lithium battery pack cost will drop by more than 50%, compared to 2013.”

The UBS report follows similar analysis by other large financial institutions and energy experts who expect new solar and renewable technologies to drive rapid change in large scale utility companies. Earlier this year, Michael Liebreich, founder and CEO of Bloomberg New Energy Finance, said: “The fact is that wind and solar have joined a long list of clean energy technologies – geothermal power, waste-to-energy, solar hot water, hydropower, sugar-cane based ethanol, combined heat and power, and all sorts of energy efficiency – which can be fully competitive with fossil fuels in the right circumstances.

“In most sunny parts of the world it is cheaper to generate power from photovoltaic modules on your roof than to buy it from your utility. The best newly built windfarms are selling power at the equivalent of 3p/kWh before subsidies, which neither gas, nor coal, nor nuclear power can match. LED lightbulbs can be bought for a few pounds, providing home-owners a quick and cheap way of cutting their utility bills. What is even more important is that the cost reductions that have led to this point are set to continue inexorably, far out into the future.”




« Reply #1179 on: Aug 27, 2014, 07:12 AM »

Mysterious half-mile crack splits open the ground in northwest Mexico

By David Ferguson
RawStory
Tuesday, August 26, 2014 10:59 EDT

Researchers are puzzling over a massive half-mile crack in the ground in a dry, desert region of Mexico’s northwest.

The New York Daily News reported that the fissure — which is up to 26 feet deep in some spots — formed near Hermosillo in the state of Sonora.

According to The Weather Channel, the crack opened on Aug. 15 and split Mexico’s Highway 36 so that drivers were forced to turn around and find detours.

Some scientists speculated that the crack was caused by seismic activity, while others said it was probably the result of an underground stream drying up, which created a void beneath the surface that then collapsed in on itself.

Martin Moreno Valencia, chief of the regional station of the Institute of Geology at UNAM in Hermosillo, told The Weather Channel that it is unlikely that an earthquake caused the rift to open, because the two sides of the crack are level with each other. Generally, in earthquake damage, one side of a rift will jut up higher.

It has been a summer of mysterious holes opening in the Earth. U.K. newspaper the Mirror reported that a 100-foot sinkhole opened in northeast England’s County Durham. The region has been experiencing heavy rain, so the hole — which locals say is so deep that you “can’t see the bottom” — is continuing to expand outward, and may soon threaten local farmhouses.

That sinkhole is believed to be related to mining activities in the area.

In Siberia, researchers are studying two giant craters that opened up in the ground earlier this summer. The holes are believed to have been caused by pockets of methane gas escaping from the ground as the region’s permafrost melts after two summers that set high temperature records.




« Reply #1180 on: Aug 28, 2014, 06:39 AM »

What global warming might mean for extreme snowfalls

By Climate Central
Wednesday, August 27, 2014 15:25 EDT

So if the world is warming, that means winters should be less snowy, right? Well, it’s a bit more complicated than that. OK, it’s a lot more complicated.

While the average annual snowfall in most parts of the world is indeed expected to decline, the extreme snowfalls — those that hit a place once every 10 or 20 years and can cause major headaches and economic impacts — may decline at a slower rate, and could even increase in particularly cold places, a new study detailed in the Aug. 28 issue of the journal Nature finds.

Essentially, in a warming world, there are “more muted changes in [the intensity of] snowfall extremes than in average snowfall,” said study author Paul O’Gorman, a climate researcher at MIT.

The definition of extreme snowfall of course depends on where you are: For Boston, where O’Gorman lives and works, an extreme snow event might dump a couple feet of snow on the city, but “what’s extreme for Atlanta would be quite different,” he told Climate Central. “It really depends on where you are.”

Because the snow depth that counts as extreme is so dependent on the place, O’Gorman defined extremes by return times, meaning storms that happen only once every decade or two, which takes subjective snow depths out of the equation.

O’Gorman was curious about what climate models would say about the future of extreme snowfall, as few studies have looked at it, unlike average snowfalls. He took advantage of simulations that had been run on 20 different climate models (under a scenario where greenhouse gas emissions increase throughout the 21st century) from centers around the world and did a statistical analysis to see what they projected for changes in average and extreme snowfall in the Northern Hemisphere by the end of the century.

Video: Extreme Weather 101: Rising Temps & Snowstorms

https://www.youtube.com/watch?v=r8oVoXaLRL0


The models suggested that the intensity of extreme snowfalls would decline less than the average annual snowfall in many regions. The exact numbers play out differently depending on the region, but, as an example: At low elevations (below about 3,330 feet) with monthly temperatures just below freezing, the average snowfall declines by 65 percent, but the intensity of extreme snowfall declines only 8 percent.

To picture what that means, let’s go back to those snow depths and return times (the numbers here are not from O’Gorman’s study and are an arbitrary example): If a 1-in-20 year snowfall event in Boston now would bury the city in, say, 3 feet of snow, that same event might dump only 2.5 feet of snow in a warmer late-century climate. But the average annual snowfall Boston might see would drop even more. To put that in terms of return times, a 3-foot snow there might become a 1-in-25 year event by century’s end.

While the return times hint at the frequency of such extreme snowfalls, and the intensity of snows are related to frequency, O’Gorman cautions that his study didn’t actually look at how often different regions might expect intense snows, just the amount of snow in, say, a 20-year storm.

The reason the change in intensity of extreme snowfalls seems to behave differently than the overall snowfall picture has to do with the physics that govern the formation of extreme snows. It seems that intense snows develop in a very narrow band of temperatures — it has to be cold enough that the precipitation won’t fall as rain, but can’t be so cold that the air doesn’t have enough moisture in it to fuel a blizzard.

In contrast, the snow that combines to give the annual total encompasses a much broader range of snow types that form under a wider swath of temperatures and so are more affected by warming. Essentially, in some places, less warming is needed to eat away at the temperature range that produces all snow than just the small range that accounts for extreme snows.

“It does make sense that when the overall climate is warming that your baseline snowfalls are going to decrease,” but you can still “pop a big snowstorm,” said David Robinson, the New Jersey state climatologist and the director of the Global Snow Lab at Rutgers University.

One caveat is that in particularly mild regions that already don’t see much snowfall, a sufficient amount of warming could knock out both the extremes and the average, O’Gorman said. (On the opposite end, places that are cold enough could actually see an increase in extreme snowfalls.)

Robinson, whose own research into snowfall trends hasn’t shown anything clear one way or the other, said that the study essentially investigates the amount of water content in snows (or what you’d get if you melted all the snow into water) and doesn’t address the issue of water content vs. snow depth. A very dense 2 feet of snow could be just as damaging as a greater depth of less dense snow, he said.
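
That distinction between water content and depth is easy to see with a snow-to-liquid ratio. The snippet below is a small illustration, not part of the study: the same two inches of liquid-equivalent precipitation can show up as very different snow depths depending on how dense the snow is.

```python
# Converting liquid-water equivalent to snow depth with assumed snow-to-liquid ratios.
def snow_depth_inches(liquid_equivalent_in, snow_to_liquid_ratio):
    return liquid_equivalent_in * snow_to_liquid_ratio

liquid = 2.0                          # inches of melted-down water content
print(snow_depth_inches(liquid, 10))  # 20 inches of typical, fluffier snow
print(snow_depth_inches(liquid, 5))   # 10 inches of dense, heavy snow, same water content
```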

O’Gorman’s study is a starting point, and he said that future work needs to look into what the observations of snowfall show and investigate when different regions might start seeing this expected climate signal in extreme snowfalls.

“I think it’s important that people are looking at these individual variables,” Robinson said, because they help scientists get a sense of where to look for change and where not to expect as much. “It’s telling you where to go look.”



« Reply #1181 on: Aug 29, 2014, 07:06 AM »

Coal plants lock in 300 billion tons of CO2 emissions

By Climate Central
Thursday, August 28, 2014 14:24 EDT

It seems straightforward to say that when you buy a new car by taking out a loan, you’re committing to spending a certain amount of your income per month on that car for a specific period of time.

Of course, by buying that car, you’re also committing to polluting the atmosphere with some amount of carbon dioxide. But how often do car buyers make that calculation?

The same can be said for coal-fired power plants, which spew billions of tons of climate-changing CO2 into the atmosphere each year, and continue to be built across the globe.

Coal-fired power plants are the largest contributors to atmospheric CO2 concentrations, which last year reached 400 parts per million (ppm) for the first time in human history — up from 280 ppm in pre-industrial times.

While utilities account for operating costs, few ever calculate how much CO2 those power plants will emit into the atmosphere during their lifespans, according to a new study conducted by Princeton University and University of California-Irvine.

That’s a huge problem for the climate because more new coal-fired power plants have been built worldwide in the past decade than in any previous decade, with no sign of slowing down, the study says.

Those existing coal-fired power plants emit billions of tons of CO2 each year and account for about 26 percent of global greenhouse gas emissions — double that of the transportation sector. In the U.S. alone, burning coal emitted 1.87 billion tons of CO2 in 2011, according to the U.S. Energy Information Administration. Worldwide, coal-burning released 14.4 billion tons of CO2 in 2011.

But the study extends those emissions out over the full lifespan of each existing power plant — 40 years per plant — and estimates that together they will spew out 300 billion tons of CO2 before they are retired, up from the 200 billion tons committed by the power plants that existed in 2000.

In other words, the power plants operating today are committed to emitting 300 billion tons of CO2 in the future, enough to contribute an additional 20 ppm of CO2 to the atmosphere globally, Princeton University professor emeritus of mechanical and aerospace engineering and study co-author Robert Socolow told Climate Central.

Estimating future emissions is called “commitment accounting,” according to the study.
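
The accounting itself is just annual emissions multiplied by the years a plant is expected to keep running, summed over the fleet. The sketch below shows that bookkeeping with two made-up plants; it follows the 40-year lifespan assumption described above but is not code from the study.

```python
# "Commitment accounting" sketch: future emissions a fleet is committed to,
# assuming each plant runs for a 40-year lifetime. Plant data are invented examples.
def committed_emissions_mt(plants, lifetime_years=40):
    total = 0.0
    for annual_mt_co2, age_years in plants:
        remaining_years = max(lifetime_years - age_years, 0)
        total += annual_mt_co2 * remaining_years
    return total

example_fleet = [
    (4.0, 5),    # newer plant: 4 Mt CO2/yr, 5 years old -> 35 committed years
    (6.5, 25),   # older plant: 6.5 Mt CO2/yr, 25 years old -> 15 committed years
]
print(committed_emissions_mt(example_fleet))   # 4*35 + 6.5*15 = 237.5 Mt CO2
```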

When those existing coal-fired power plants are shut down, current trends in China and other developing nations suggest that new ones will replace them, committing the globe to even more CO2 emissions at a time when the climate can least tolerate it, Socolow said.

Calculating CO2 emissions commitments from power plants is almost never done because CO2 emissions are reported to the United Nations based on emissions in a single year rather than those expected in future years, the study says.

“Bringing down carbon emissions means retiring more fossil fuel facilities than we build,” study lead author Steven Davis, assistant professor of earth system science at UC-Irvine, said in a statement. “But worldwide, we’ve built more coal-burning power plants in the past decade than in any previous decade, and closures of old plants aren’t keeping pace with this expansion.”

In the U.S., the Obama administration has set a goal under the Clean Power Plan to slash CO2 emissions from existing coal-fired power plants 30 percent below 2005 levels by 2030.

The study says that despite international efforts to reduce CO2 emissions, the global power sector’s CO2 commitments are growing 4 percent each year, and have not declined at all since 1950.

As developing nations like China and India and other countries become more industrialized and build more and more coal-fired power plants — China and India account for more than half of all the coal used on the planet — the world is being committed to more and more CO2 emissions in the coming years.

“Remaining commitments have gotten bigger and bigger every year without exception,” Davis said. “We’re now at the point where power plants alone will emit 300 billion tons if they run 40 years.”

Damon Matthews, Concordia University chair in climate science and sustainability who reviewed the study prior to publication, said the study is a new way of thinking about power plant emissions.

“If we can account for committed emissions over a lifetime of a plant at the time it is built, this may change the equation about what type of power plants it makes sense to invest in,” Matthews said.

Stephane Hallegatte, senior economist in the Climate Change Group at the World Bank and a reviewer of the study prior to publication, said the study is crucial because it creates an indicator to help policymakers understand the long-term consequences of their decisions.

“Indeed, the problem is that we have invested and continue to invest in infrastructure and equipment — including power plants — that emit and will emit for a long time,” Hallegatte said. “Because of the long lifetime of these investments, reducing emissions in 2030 requires an action that starts as soon as possible.”

Accounting for future CO2 pollution commitment is critical for policymakers and the power sector to better understand their role in a changing climate and what can be done to reduce CO2 emissions globally, the study says.

It’s possible that some power plants may not be used for their full life expectancy, but that’s a rare occurrence, Socolow said.

“Of course, we can retire plants before the end of their natural lifetime or retrofit them with new technology,” Matthews said. “But this is expensive to do, so we can’t assume that will happen.”

Socolow said one of the things he hopes the paper will do is prod the UN reporting system to account for future emissions. The electric power industry has no good data on emissions, and emission estimates reported to governments are usually based on the amount of coal bought and sold rather than measurements of actual emissions at power plants, he said.

“The result of this paper’s analysis — namely the rapid increase in committed emissions — shows that actions to direct new investments toward cleaner technologies are even more urgent than what emissions alone suggest,” Hallegatte said.

« Reply #1182 on: Aug 29, 2014, 07:12 AM »

Southeast Louisiana is being devoured by the sea — and it’s going to get worse, even quicker

By Bob Marshall, The Lens, Brian Jacobs and Al Shaw, ProPublica
Thursday, August 28, 2014 15:36 EDT

In just 80 years, some 2,000 square miles of Louisiana’s coastal landscape have turned to open water, wiping places off maps, bringing the Gulf of Mexico to the back door of New Orleans and posing a lethal threat to an energy and shipping corridor vital to the nation’s economy.

And it’s going to get worse, even quicker.

Scientists now say one of the greatest environmental and economic disasters in the nation’s history is rushing toward a catastrophic conclusion over the next 50 years, so far unabated and largely unnoticed.

At the current rates that the sea is rising and land is sinking, National Oceanic and Atmospheric Administration scientists say by 2100 the Gulf of Mexico could rise as much as 4.3 feet across this landscape, which has an average elevation of about 3 feet. If that happens, everything outside the protective levees — most of Southeast Louisiana — would be underwater.

The effects would be felt far beyond bayou country. The region best known for its self-proclaimed motto “laissez les bons temps rouler” — let the good times roll — is one of the nation’s economic linchpins.

This land being swallowed by the Gulf is home to half of the country’s oil refineries, a matrix of pipelines that serve 90 percent of the nation’s offshore energy production and 30 percent of its total oil and gas supply, a port vital to 31 states, and 2 million people who would need to find other places to live.

The landscape on which all that is built is washing away at a rate of a football field every hour, 16 square miles per year.
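
Those two figures are the same rate expressed two ways. A quick check, assuming the playing field alone (about 48,000 square feet, end zones excluded), lands near the quoted 16 square miles a year; counting the end zones pushes it to roughly 18.

```python
# Consistency check: "a football field every hour" vs. "16 square miles per year".
# Assumes the playing field only (300 ft x 160 ft); end zones would add ~20 percent.

SQFT_PER_FIELD = 300 * 160          # 48,000 square feet
SQFT_PER_SQMI = 5280 ** 2           # square feet in a square mile
HOURS_PER_YEAR = 24 * 365

sq_mi_per_year = SQFT_PER_FIELD * HOURS_PER_YEAR / SQFT_PER_SQMI
print(f"~{sq_mi_per_year:.0f} square miles per year")   # ~15, in line with the quoted 16
```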

For years, most residents didn’t notice because they live inside the levees and seldom travel into the wetlands. But even those who work or play in the marshes were misled for decades by the gradual changes in the landscape. A point of land eroding here, a bayou widening there, a spoil levee sinking a foot over 10 years. In an ecosystem covering thousands of square miles, those losses seemed insignificant. There always seemed to be so much left.

Now locals are trying to deal with the shock of losing places they had known all their lives — fishing camps, cypress swamps, beachfronts, even cattle pastures and backyards — with more disappearing every day.

Fishing guide Ryan Lambert is one of them. When he started fishing the wetlands out of Buras 34 years ago, he had to travel through six miles of healthy marshes, swamps and small bays to reach the Gulf of Mexico.

“Now it’s all open water,” Lambert said. “You can stand on the dock and see the Gulf.”

Two years ago, NOAA removed 31 bays and other features from the Buras charts. Some had been named by French explorers in the 1700s.

The people who knew this land when it was rich with wildlife and dotted with Spanish- and French-speaking villages are getting old. They say their grandchildren don’t understand what has been lost.

“I see what was,” said Lloyd “Wimpy” Serigne, who grew up in the fishing and trapping village of Delacroix, 20 miles southeast of New Orleans. It was once home to 700 people; now there are fewer than 15 permanent residents. “People today — like my nephew, he’s pretty young — he sees what is.”

If this trend is not reversed, a wetlands ecosystem that took nature 7,000 years to build will be destroyed in a human lifetime.

The story of how that happened is a tale of levees, oil wells and canals leading to destruction on a scale almost too big to comprehend — and perhaps too late to rebuild. It includes chapters on ignorance, unintended consequences and disregard for scientific warnings. It’s a story that is still unfolding.

Speck by speck, land built over centuries

The coastal landscape Europeans found when they arrived at the mouth of the Mississippi River 500 years ago was the Amazon of North America, a wetlands ecosystem of more than 6,000 square miles built by one of the largest rivers in the world.

For thousands of years, runoff from the vast stretch of the continent between the Rockies and the Appalachians had flowed into the Mississippi valley. Meltwater from retreating glaciers, seasonal snowfall and rain carried topsoil and sand from as far away as the Canadian prairies. The river swelled as it rushed southward on the continent’s downward slope, toward the depression in the planet that would become known as the Gulf of Mexico.

Down on the flat coastal plain, the giant river slowed. It lost the power to carry those countless tons of sediment, which drifted to the bottom. Over thousands of years, this rain of fine particles gradually built land that would rise above the Gulf.

It wasn’t just the main stem of the Mississippi doing this work. When the river reached the coastal plain, side channels — smaller rivers and bayous — peeled off. They were called “distributaries,” for the job they did spreading that land-building sediment ever farther afield.

The delta had two other means of staying above the Gulf. The plants and trees growing in its marshes and swamps shed tons of dead parts each year, adding to the soil base. Meanwhile, storms and high tides carried sediment that had been deposited offshore back into the wetlands.

As long as all this could continue unobstructed, the delta continued to expand. But with any interruption, such as a prolonged drought, the new land began to sink.

That’s because the sheer weight of hundreds of feet of moist soil is always pushing downward against the bedrock below. Like a sponge pressed against a countertop, the soil compresses as the moisture is squeezed out. Without new layers of sediment, the delta eventually sinks below sea level.

The best evidence of this dependable rhythm of land building and sinking over seven millennia is underground. Geologists estimate that the deposits were at least 400 feet deep at the mouth of the Mississippi when those first Europeans arrived.

By the time New Orleans was founded in 1718, the main channel of the river was the beating heart of a system pumping sediment and nutrients through a vast circulatory network that stretched from present-day Baton Rouge south to Grand Isle, west to Texas and east to Mississippi. As late as 1900, new land was pushing out into the Gulf of Mexico.

A scant 70 years later, that huge, vibrant wetlands ecosystem would be at death’s door. The exquisite natural plumbing that made it all possible had been dismantled, piece by piece, to protect coastal communities and extract oil and gas.

Engineering the river

For communities along its banks, the Mississippi River has always been an indispensable asset and their gravest threat. The river connected their economies to the rest of the world, but its spring floods periodically breached locally built levees, quickly washing away years of profits and scores of lives. Some towns were so dependent on the river, they simply got used to rebuilding.

That all changed with the Great Flood of 1927.

Swollen by months of record rainfall across the watershed, the Mississippi broke through levees in 145 places, flooding the midsection of the country from Illinois to New Orleans. Some 27,000 square miles went under as much as 30 feet of water, destroying 130,000 homes, leaving 600,000 people homeless and killing 500.

Stunned by what was then the worst natural disaster in U.S. history, Congress passed the Flood Control Act of 1928, which ordered the U.S. Army Corps of Engineers to prevent such a flood from ever happening again. By the mid-1930s, the corps had done its job, putting the river in a straitjacket of levees.

But the project that made the river safe for the communities along the river would eventually squeeze the life out of the delta. The mud walls along the river sealed it off from the landscape sustained by its sediment. Without it, the sinking of land that only occurred during dry cycles would start, and never stop.

If that were all we had done to the delta, scientists have said, the wetlands that existed in the 1930s could largely be intact today. The natural pace of sinking — scientists call it subsidence — would have been mere millimeters per year.

But we didn’t stop there. Just as those levees were built, a nascent oil and gas industry discovered plentiful reserves below the delta’s marshes, swamps and ridges.

At the time, wetlands were widely considered worthless — places that produced only mosquitoes, snakes and alligators. The marsh was a wilderness where few people could live, or even wanted to.

There were no laws protecting wetlands. Besides, more than 80 percent of this land was in the hands of private landowners who were happy to earn a fortune from worthless property.

Free to choose the cheapest, most direct way to reach drilling sites, oil companies dredged canals off natural waterways to transport rigs and work crews. The canals averaged 13 to 16 feet deep and 140 to 150 feet wide — far larger than natural, twisting waterways.

Effects of canals ripple across the wetlands

Eventually, some 50,000 wells were permitted in the coastal zone. The state estimates that roughly 10,000 miles of canals were dredged to service them, although that only accounts for those covered by permitting systems. The state began to require some permits in the 1950s, but rigorous accounting didn’t begin until the Clean Water Act brought federal agencies into play in 1972.

Researchers say the total number of miles dredged will never be known because many of those areas are now underwater. Gene Turner, a Louisiana State University professor who has spent years researching the impacts of the canals, said 10,000 miles “would be a conservative estimate.”

Companies drilled and dredged all over the coast, perhaps nowhere more quickly than the area near Lafitte, which became known as the Texaco Canals.

This fishing village 15 miles south of New Orleans had been named for the pirate who used these bayous to ferry contraband to the city. For years, the seafood, waterfowl and furbearers in the surrounding wetlands sustained the community. As New Orleans grew, Lafitte also became a favorite destination for weekend hunters and anglers.

Today those scenes are only a memory.

“Once the oil companies come in and started dredging all the canals, everything just started falling apart,” said Joseph Bourgeois, 84, who grew up and still lives in the area.

From 1930 to 1990, as much as 16 percent of the wetlands was turned to open water as those canals were dredged. But as the U.S. Department of the Interior and many others have reported, the indirect damages far exceeded that:

    Saltwater crept in: Canal systems leading to the Gulf allowed saltwater into the heart of freshwater marshes and swamps, killing plants and trees whose roots held the soils together. As a side effect, the annual supply of plant detritus — one way a delta disconnected from its river can maintain its elevation — was seriously reduced.

    Shorelines crumbled: Without fresh sediment and dead plants, shorelines began to collapse, increasing the size of existing water bodies. Wind gained strength over ever-larger sections of open water, adding to land loss. Fishers and other boaters used canals as shortcuts across the wetlands; their wakes also sped shoreline erosion. In some areas, canals grew twice as wide within five years.

    Spoil levees buried and trapped wetlands: When companies dredged canals, they dumped the soil they removed alongside, creating “spoil levees” that could rise higher than 10 feet and twice as wide. The weight of the spoil on the soft, moist delta caused the adjacent marshes to sink. In locations of intense dredging, spoil levees impounded acres of wetlands. The levees also impeded the flow of water — and sediments — over wetlands during storm tides. If there were 10,000 miles of canals, there were 20,000 miles of levees. Researchers estimate that canals and levees eliminated or covered 8 million acres of wetlands.

All this disrupted the delta’s natural hydrology — its circulatory system — and led to the drowning of vast areas. Researchers have shown that land has sunk and wetlands have disappeared the most in areas where canals were concentrated.

In the 1970s, up to 50 square miles of wetlands were disappearing each year in the areas with heaviest oil and gas drilling and dredging, bringing the Gulf within sight of many communities.

As the water expanded, people lived and worked on narrower and narrower slivers of land.

“There’s places where I had cattle pens, and built those pens … with a tractor that weighed 5,000 or 6,000 pounds,” said Earl Armstrong, a cattle rancher who grew up on the river nine miles south of the nearest road. “Right now we run through there with airboats.”

There are other forces at work, including a series of geologic faults in the delta and the rock layers beneath, but a U.S. Department of Interior report says oil and gas canals are ultimately responsible for 30 to 59 percent of coastal land loss. In some areas of Barataria Bay, said Turner at LSU, it’s close to 90 percent.

Even more damage was to come as the oil and gas industry shifted offshore in the late 1930s, eventually planting about 7,000 wells in the Gulf. To carry that harvest to onshore refineries, companies needed more underwater pipelines. So they dug wider, deeper waterways to accommodate the large ships that served offshore platforms.

Congress authorized the Corps of Engineers to dredge about 550 miles of navigation channels through the wetlands. The Department of Interior has estimated that those canals, averaging 12 to 15 feet deep and 150 to 500 feet wide, resulted in the loss of an additional 369,000 acres of coastal land.

Researchers eventually would show that the damage wasn’t due to surface activities alone. When all that oil and gas was removed from below some areas, the layers of earth far below compacted and sank. Studies have shown that coastal subsidence has been highest in some areas with the highest rates of extraction.

Push to hold industry accountable

The oil and gas industry, one of the state’s most powerful political forces, has acknowledged some role in the damages, but so far has defeated efforts to force companies to pay for it.

The most aggressive effort to hold the industry accountable is now underway. In July 2013, the Southeast Louisiana Flood Protection Authority-East, which maintains levees around New Orleans, filed suit against more than 90 oil, gas and pipeline companies.

The lawsuit claims that the industry, by transforming so much of the wetlands to open water, has increased the size of storm surges. It argues this is making it harder to protect the New Orleans area against flooding and will force the levee authority to build bigger levees and floodwalls.

The lawsuit also claims that the companies did not return the work areas to their original condition, as required by state permits.

“The oil and gas industry has complied with each permit required by the State of Louisiana and the Corps of Engineers since the permits became law,” said Ragan Dickens, spokesman for the Louisiana Oil and Gas Association.

State leaders immediately rose to the industry’s defense. Much of the public debate has not been about the merits of the suit; instead, opponents contested the authority’s legal right to file the suit and its contingency fee arrangement with a private law firm.

“We’re not going to allow a single levee board that has been hijacked by a group of trial lawyers to determine flood protection, coastal restoration and economic repercussions for the entire State of Louisiana,” said Gov. Bobby Jindal in a news release demanding that the levee authority withdraw its suit.

“A better approach,” he said in the statement, “to helping restore Louisiana’s coast includes holding the Army Corps of Engineers accountable, pushing for more offshore revenue sharing and holding BP accountable for the damage their spill is doing to our coast.”

The industry’s political clout reflects its outsized role in the economy of one of the nation’s poorest states. The industry directly employs 63,000 people in the state, according to the federal Department of Labor.

Many of those employees live in the coastal parishes that have suffered most from oil and gas activities and face the most severe consequences from the resulting land loss.

Legislators in those areas helped Jindal pass a law that retroactively sought to remove the levee authority’s standing to file the suit. The constitutionality of that law is now before a federal judge.

Consequences now clear

Even as politicians fought the lawsuit, it was hard to deny what was happening on the ground.

By 2000, coastal roads that had flooded only during major hurricanes were going underwater when high tides coincided with strong southerly winds. Islands and beaches that had been landmarks for lifetimes were gone, lakes had turned into bays, and bays had eaten through their borders to join the Gulf.

“It happened so fast, I could actually see the difference day to day, month to month,” said Lambert, the fishing guide in Buras.

Today, in some basins around New Orleans, land is sinking an inch every 30 months. At this pace, by the end of the century this land will sink almost 3 feet in an area that’s barely above sea level today.
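
That projection is simple to reproduce. A small check, assuming the inch-per-30-months rate simply continues from now until 2100, comes out just under 3 feet:

```python
# Check of the subsidence arithmetic: an inch of sinking every 30 months, carried to 2100.

INCHES_PER_MONTH = 1.0 / 30.0
months_to_2100 = (2100 - 2014) * 12        # 86 years at the current rate

total_inches = INCHES_PER_MONTH * months_to_2100
print(f"~{total_inches:.0f} inches (~{total_inches / 12:.1f} feet) of sinking by 2100")
# ~34 inches, i.e. just under 3 feet, matching the article's "almost 3 feet"
```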

Meanwhile, global warming is causing seas to rise worldwide. Coastal landscapes everywhere are now facing a serious threat, but none more so than Southeast Louisiana.

The federal government projects that seas along the U.S. coastline will rise 1.5 to 4.5 feet by 2100. Southeast Louisiana would see “at least” 4 to 5 feet, said NOAA scientist Tim Osborn.

The difference: This sediment-starved delta is sinking at one of the fastest rates of any large coastal landscape on the planet at the same time the oceans are rising.

Maps used by researchers to illustrate what the state will look like in 2100 under current projections show the bottom of Louisiana’s “boot” outline largely gone, replaced by a coast running practically straight east to west, starting just south of Baton Rouge. The southeast corner of the state is represented only by two fingers of land — the areas along the Mississippi River and Bayou Lafourche that currently are protected by levees.

Finally, a plan to rebuild — but not enough money

Similar predictions had been made for years. But Hurricane Katrina finally galvanized the state Legislature, which pushed through a far-reaching coastal restoration plan in 2007.

The 50-year, $50 billion Master Plan for the Coast (in 2012 dollars) includes projects to build levees, pump sediment into sinking areas, and build massive diversions on the river to reconnect it with the dying delta.

The state’s computer projections show that by 2060 — if projects are completed on schedule — more land could be built annually than is lost to the Gulf.

But there are three large caveats.

    The state is still searching for the full $50 billion. Congress so far has been unwilling to help.
    If the plan is to work, sea-level rise can’t be as bad as the worst-case scenario.
    Building controlled sediment diversions on the river, a key part of the land-building strategy, has never been done before. The predictions, then, are largely hypothetical, although advocates say the concept is being proven by an uncontrolled diversion at West Bay, near the mouth of the river.

Some of the money will come from an increased share of offshore oil and gas royalties, but many coastal advocates say the industry should pay a larger share.

In fact, leaders of the regional levee authority have said the purpose of the lawsuit was to make the industry pay for the rebuilding plan, suggesting the state could trade immunity from future suits for bankrolling it.

That idea is gaining momentum in official circles, despite the industry’s latest win in the state Legislature.

Kyle Graham, executive director of the Louisiana Coastal Protection and Restoration Authority, said recently that the industry understands its liability for the crumbling coast and is discussing some kind of settlement. “It’s very difficult to see a future in which that [such an agreement] isn’t there,” he said.

Graham has said current funding sources could keep the restoration plan on schedule only through 2019. He was blunt when talking about what would happen if more money doesn’t come through: There will be a smaller coast.

“There are various sizes of a sustainable coastal Louisiana,” he said. “And that could depend on how much our people are willing to put up for that.”

A vanishing culture

Trying to keep pace with the vanishing pieces of southeast Louisiana today is like chasing the sunset; it’s a race that never ends.

Lambert said when he’s leading fishing trips, he finds himself explaining to visitors what he means when he says, “This used to be Bay Pomme d’Or” and the growing list of other spots now only on maps.

Signs of the impending death of this delta are there to see for any visitor.

Falling tides carry patches of marsh grass that have fallen from the ever-crumbling shorelines.

Pelicans circle in confusion over nesting islands that have washed away since last spring.

Pilings that held weekend camps surrounded by thick marshes a decade ago stand in open water, hundreds of yards from the nearest land — mute testimony to a vanishing culture.

Shrimpers push their wing nets in lagoons that were land five years ago.

The bare trunks of long-dead oaks rise from the marsh, tombstones marking the drowning of high ridges that were built back when the river pumped life-giving sediment through its delta.

“If you’re a young person you think this is what it’s supposed to look like,” Lambert said. “Then when you’re old enough to know, it’s too late.”

« Reply #1183 on: Aug 29, 2014, 11:17 AM »

Ebola virus evolving — sometimes in a single person — as it spreads across West Africa

By Agence France-Presse
Friday, August 29, 2014 12:49 EDT

Scientists tracking the spread of Ebola across West Africa released Thursday 99 sequenced genomes of the deadly and highly contagious hemorrhagic virus in the hopes the data may accelerate diagnosis and treatment.

As a sign of the urgency and danger at hand, five of the nearly 60 international co-authors who helped collect and analyze the viral samples have died of Ebola already this year, said the report in the journal Science.

At least 1,552 people have been killed and more than 3,000 infected in Guinea, Sierra Leone, Liberia and Nigeria, according to the World Health Organization’s latest toll.

Never before has there been an Ebola outbreak so large, nor has the virus — which was first detected in 1976 — ever infected people in West Africa until now.

“We’ve uncovered more than 300 genetic clues about what sets this outbreak apart from previous outbreaks,” said Stephen Gire, a research scientist in the Sabeti lab at the Broad Institute and Harvard University.

“Although we don’t know whether these differences are related to the severity of the current outbreak, by sharing these data with the research community we hope to speed up our understanding of this epidemic and support global efforts to contain it.”

- How the virus spreads -

Ebola spreads through close contact with the bodily fluids of infected people when they are showing symptoms, such as fever, vomiting and diarrhea.

It can also be contagious after a person dies, and health officials have warned that touching corpses during funeral rites can be a key route of transmission.

West Africa’s outbreak began in Guinea early this year, then spread to Liberia in March, Sierra Leone in May, and Nigeria in late July.

Researchers took samples of the virus from 78 patients in Sierra Leone during the first few weeks of the outbreak there.

They released 99 sequences, because they sampled some of the patients twice to show how the virus could evolve in a single person.

Scientists focused some of their study on a group of 12 people who were infected in Sierra Leone while attending the funeral of a traditional healer who had come into contact with Ebola patients from Guinea.

Their story was first told to an AFP reporter in Kenema, Sierra Leone last week.

Samples taken from the 12 funeral attendees show that the Sierra Leone outbreak stemmed from two genetically distinct viruses circulating in Guinea at the time, the Science report found.

This suggests “the funeral attendees were most likely infected by two lineages then circulating in Guinea, possibly at the funeral.”

The mourners included a young pregnant woman. Soon after, she was hospitalized with a fever and miscarried, Science said.

She survived, and became Sierra Leone’s first confirmed case of Ebola in the outbreak that swept West Africa.

- Animal host? -

Researchers said they believe the virus spreads via an animal host, possibly a kind of fruit bat that has a natural range from central Africa — where Ebola has caused human outbreaks before — to Guinea in the continent’s far west.

A report in The New England Journal of Medicine in April said the first known case of the West Africa outbreak was believed to be a young child who died December 6, 2013 in Guinea.

Some researchers believe the child may have come into contact with an infected fruit bat, though that theory has not been proven.

There is no drug or vaccine on the market to treat or prevent Ebola.

The first human trials of a potential vaccine are set to begin near the US capital next week.

« Reply #1184 on: Aug 31, 2014, 07:37 AM »

Prototype for Ebola deterrent drug clears early test hurdle

By Agence France-Presse
Friday, August 29, 2014 20:44 EDT

Paris (AFP) – A prototype drug that has been urgently given to a handful of patients with Ebola has cleared an important test hurdle, showing that it cured lab monkeys with the disease, scientists said Friday.

Normally, experimental drugs are tested first on animals and then on progressively larger groups of humans to ensure they are safe and effective.

But, in an exceptional move, a new drug called ZMapp that has not gone through these tests has been rushed to the outbreak in west Africa, as the lethal disease has no cure.

Reporting online in the British journal Nature, researchers at the Public Health Agency of Canada said 18 rhesus macaque monkeys given high doses of Ebola virus fully recovered after being given ZMapp, even when it was administered five days after infection.

It reversed dangerous symptoms such as bleeding, rashes and high levels of enzymes in the liver.

Three “control” monkeys that had been infected, but not treated, all died within eight days.

The 21 animals had been given the so-called Kikwit strain of Ebola, named after a location in the Democratic Republic of Congo, the country where the haemorrhagic fever was discovered in 1976.

But lab-dish tests indicate it can also inhibit the strain in Guinea which has sparked the current epidemic, the scientists said.

- Good first step -

Independent experts hailed the results as an encouraging first step in the long vetting process.

They added, though, it was still unclear whether ZMapp worked on humans, as two patients who have been given it have died and two others have recovered.

“Widespread availability and use of ZMapp will require human safety testing and licensing, coupled with scaleup of the manufacturing process,” cautioned David Evans, a professor of virology at Britain’s University of Warwick.

A cocktail of three antibodies designed to cling to the Ebola virus and inhibit its reproduction, ZMapp is being developed by Mapp Biopharmaceutical Inc. of San Diego, California, partly in conjunction with the US Army.

ZMapp has so far been given to seven infected frontline workers.

Of these, two American doctors have recovered; a Liberian doctor and a Spanish priest have died; and a doctor and a nurse, both Liberian, and a British nurse, who has been flown to London from Sierra Leone, are still in treatment.

The World Health Organization gave the green light on August 12, saying it was ethical to use experimental drugs in the context of this dangerous epidemic.

Stocks of ZMapp, which is derived from tobacco leaves and is hard to produce on a large scale, are exhausted, the company said on August 12.

The other main experimental drug for the disease is TKM-Ebola, being developed by Tekmira Pharmaceuticals Corp. of Vancouver, Canada, under a $140-million (105-million-euro) contract with the Pentagon.

It is currently in a Phase I human trial, the first step in the three-phase test process. In this phase, a drug is evaluated on healthy non-infected humans to see whether it is safe. Further phases test it for safety and also effectiveness.

More than 1,500 people have died in Guinea, Liberia, Nigeria and Sierra Leone since the disease emerged in West Africa last December.
