A recent 60-second podcast from Scientific American (Will Economic Health Align With Environmental Health?: Scientific American Podcast) reports that the 2009 economic downturn apparently resulted in a reduction in greenhouse gas emissions. Still, 2009 emissions were second only to 2008 as the highest ever.
Based on energy consumption statistics from various nations, scientists estimate that global CO2 emissions dropped by 1.3 percent in 2009. But that still made them the second highest ever—just behind 2008.
The biggest drops were recorded in Japan, the UK and Russia while China, India and South Korea continued to emit more than ever before.
The good news is emissions are back on track this year to increase by more than 3 percent worldwide—a sure indication of economic recovery. And the amount of CO2 emitted for every unit of economic activity is no longer shrinking as fast as it once was, thanks to a new boom in dirty coal.
The podcast raises the question of whether it is possible to have a strong economy without damaging the environment and, conversely, whether environmental protection comes at the expense of the economy.
In reality, that is the challenge of sustainability: how do companies, nations and consumers adopt sustainable practices while remaining economically sound? These are the earliest days of the sustainability movement, and it is likely that the technologies to really answer this question don’t exist yet. It is not unlike the emergence of our current system of distributing electricity. The first electrical distribution systems in the US were based on direct current (DC).
Each voltage level of DC power must be distributed across dedicated power lines. A modern home would therefore need separate power lines for lower-voltage uses (around 110 V) like lighting and televisions, and separate lines for higher-voltage loads like refrigerators, electric clothes dryers and stoves. DC also loses voltage with distance, so the point of power generation must be close to the point of consumption. AC power, though distributed at a single voltage, can be converted to other voltages through the use of transformers. Transformers can both increase and decrease voltage, making it possible to correct for voltage loss over transmission distance and to provide multiple voltage levels at the point of consumption through a single connection to the power grid.
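The advantage transformers give AC can be made concrete with a little arithmetic. Transmission loss goes as the square of the current, so stepping the voltage up (and the current down) for the long haul slashes the loss. The numbers below are purely illustrative, not real grid figures:

```python
# Rough illustration: delivering 10 kW over a line with 0.5 ohm of
# resistance. Line loss is I^2 * R, and current I = P / V, so raising
# the transmission voltage sharply cuts the loss. Illustrative numbers.

def line_loss_watts(power_w, voltage_v, resistance_ohm):
    current = power_w / voltage_v          # I = P / V
    return current ** 2 * resistance_ohm  # loss = I^2 * R

P, R = 10_000, 0.5
for volts in (110, 2_400, 24_000):
    loss = line_loss_watts(P, volts, R)
    print(f"{volts:>6} V: {loss:10.2f} W lost ({loss / P:.2%} of load)")
```

At household voltage the line would burn off over 40% of the load; at a transmission-level voltage the same line loses a tiny fraction of a percent, which is why AC's easy voltage conversion won out.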
After alternating current (AC) generated at Niagara Falls was shown in 1896 to be capable of powering industry in the Buffalo area, AC power distribution became the standard. Enough DC capacity had been installed in the early days of power transmission that, despite AC being the superior technology for long-distance transmission, DC distribution systems persisted in portions of major cities, including New York and Boston, well into the 20th century. A number of industrial and commercial installations also continued to rely on DC power.
Imagine how different our world would be if the AC-versus-DC battle had played out differently. It is unlikely that the limitations posed by alternative energy sources today will remain in place over time. To really start to tackle the imbalance between the economy and the environment, we will need to focus our efforts on developing practical alternative energy sources that don’t result in increased environmental harm.
As children, many of us read Dr. Seuss’s The Cat in the Hat Comes Back, where we were introduced to VOOM. For those who didn’t read it or don’t remember: after wrecking the bathroom with a pink ring in the tub, the Cat keeps calling on more Little Cat assistants to help clean up the pink mess. Each assistant makes matters worse until the whole house, even the snow outside, is covered in pink. In fact, one might say the pink mess was dispersed all over the place. Then the last and smallest assistant, Little Cat Z, uses VOOM, and just like that the mess is cleaned up and all of the assistants are blown back under the Cat’s hat.
So perhaps you are wondering why I am relating VOOM to the BP Gulf oil spill. As the world watched over the spring and summer, BP made one failed attempt after another to stop the flow of oil from its blown well in the Gulf. I could not help but be reminded of the Cat’s assistants, spreading the mess. Then, on July 15th, BP got a cap in place and the flow stopped. We were warned by the government and the press that this was not the final solution. However, for all intents and purposes, particularly for keeping a spotlight on BP, it was.
Quietly, and without the media attention you might have expected given the magnitude of the spill, BP successfully completed the permanent closure of the well on September 19th. The flow of oil had already stopped on July 15th. In the intervening weeks between mid-July and mid-September, the attention of the national media, and with it the nation, turned to other things. There was no more spectacle, no more round-the-clock video of oil steadily pouring out of the damaged well.
However, a looming and hotly debated question since has been: where did all the oil go? Scientists and experts are divided on the answer. The US government contends that the majority of the oil is gone, providing the following estimates:
- 33% of the oil was recovered, burned or dispersed
- 25% of the oil naturally evaporated
- 16% of the oil broke down naturally
- 26% of the oil is just below the surface or washed up on shore
Figure 1 is an excerpt from the NOAA report “BP Deepwater Horizon Oil Budget: What Happened To the Oil?”, where additional details can be found.
Yet to say the oil is dispersed, whether naturally or through the use of dispersants, doesn’t really answer the question of “Where did it go?”.
Some point to rapid degradation by natural oil-consuming microorganisms. It makes a kind of logical sense that a region regularly subject to natural oil seepage would host an abundance of organisms capable of breaking down oil. One of the fundamental principles of bioremediation is to enhance the environment to support the activity of naturally occurring microorganisms, rather than trying to introduce tailored organisms to consume contaminants. However, the natural environment in the Gulf was not enhanced.
Concern was raised that, combined with the expected seasonal increase in biological activity in the Gulf, increased activity of petroleum-consuming microbes would produce a large area of oxygen-depleted water. When measurements were taken, the expected zone of depleted oxygen was not found. So what does that mean? Some postulate that the oxygen depletion observed is insufficient to indicate the level of biological activity that would accompany the breakdown of the amount of oil spilled in the Gulf. Others have speculated that, because of the dispersants, the biological activity was less concentrated and therefore still occurred, but without depleting the oxygen.
In August, researchers from the University of South Florida found significant deposits of crude on the seabed. They also discovered that microorganisms in the area had been affected by the oil.
Subsequently, in September, researchers from the University of Georgia also found areas of seabed heavily impacted by oil (read the AP press release “Where’s the Oil? On the Gulf Floor Scientists Say”). A third research team, from Texas A&M, reported finding oil from the Macondo well 3,000 feet below the ocean surface, 300 miles from the well. These findings raised skepticism about the government’s claims regarding the fate of the released oil. Many believe a much larger quantity of oil than estimated sank, as a result of the depth of the initial release and the heavy use of dispersants. There is concern that oil on the seabed has the potential to affect many species via the food chain, since many of the lower-level species in the food chain originate on the seabed. Plans are underway to study this further.
Many have pointed to lessons from the Exxon Valdez release. One of the main points is that, still some 20 years later, residual oil can easily be found on the shores around Prince William Sound. In most instances you have only to overturn rocks or dig a few inches into the sand and sediment to uncover it; in some places the weathered oil is plainly visible without digging or overturning anything. The second point is that some of the significant species impacts were not seen until up to three years after the spill; determining species damage may require observing more than one breeding cycle. The Pacific herring disappeared from Prince William Sound three years after the Valdez spill. The evidence is not conclusive that the disappearance is related to the spill, but most have concluded that it is. Certainly, conditions in Alaska were, and are, different from those in the Gulf. The Valdez release was a surface spill into a sheltered body of water, and the normally cold Alaskan weather is not conducive to the type of natural pollutant degradation expected in the Gulf.
What does seem to be the case, however, is that at this juncture no one is certain where the oil from the Gulf spill went. In fact, there isn’t even full agreement on the quantity of oil released, though there is emerging consensus that at least 4.4 million barrels (about 185 million gallons) were released. The area, both the environment and the ecology, will need to be studied for some years to come to really assess where the oil went and what level of natural resources were damaged by the spill. I suspect that at the conclusion of those studies BP will not actually have found VOOM.
In March of this year, EPA committed to giving the public wider access to the data collected under the Toxic Substances Control Act (TSCA). This action was consistent with EPA Administrator Lisa P. Jackson’s commitment to increasing the public’s access to chemical information. According to Steve Owens, assistant administrator for EPA’s Office of Prevention, Pesticides and Toxic Substances, “Increasing the public’s access to information on chemicals is one of Administrator Jackson’s top priorities. The American people are entitled to easily accessible information on chemicals.”
Prior to March 2010, the consolidated public portion of the TSCA Inventory was available only for a fee. In March, the public was given free access to the data on EPA’s website as well as on Data.gov, which is part of the Obama administration’s ongoing effort to provide more public access to data collected and maintained by the Federal government.
On May 17, 2010, EPA announced that it would also add information on 6,300 chemicals and 3,800 chemical facilities to its Envirofacts database and website, with historical information added for another 2,500 facilities. Envirofacts is EPA’s single point of access on the Internet for information about environmental activities that may affect air, water and land in the U.S., and it provides tools for analyzing the data. It includes facility names and addresses, aerial images of each facility and the surrounding area, map locations, and links to other EPA information on the facility, such as inspection and compliance reports available through the Enforcement Compliance History Online (ECHO) database.
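Envirofacts can also be queried programmatically. The sketch below builds a query URL following the RESTful `efservice` URL pattern Envirofacts publishes; the specific table and column names used here are assumptions for illustration and should be checked against EPA’s current Envirofacts documentation:

```python
# Sketch: constructing an Envirofacts RESTful query URL. The efservice
# pattern is documented by EPA, but the TRI_FACILITY table and
# STATE_ABBR column used below are illustrative assumptions -- verify
# them against the current Envirofacts data-service documentation.

BASE = "https://enviro.epa.gov/efservice"

def envirofacts_url(table, column, value, fmt="JSON"):
    """Build a URL selecting rows of `table` where `column` equals `value`."""
    return f"{BASE}/{table}/{column}/{value}/{fmt}"

# Hypothetical example: TRI facilities in Louisiana, returned as JSON.
url = envirofacts_url("TRI_FACILITY", "STATE_ABBR", "LA")
print(url)  # https://enviro.epa.gov/efservice/TRI_FACILITY/STATE_ABBR/LA/JSON
```

Fetching that URL with any HTTP client would return the matching rows, which is what makes the May 2010 additions useful beyond browsing the website.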
For more information see