RL Macklin's Sustainability & EHS Website

Integrated Strategies for Managing Sustainability & EHS


The Unfortunate Balance Between The Economy and The Environment

A recent 60-second podcast from Scientific American (Will Economic Health Align With Environmental Health?: Scientific American Podcast) reports that the 2009 economic downturn apparently resulted in a reduction in greenhouse gas emissions.  Even so, 2009 emissions were second only to 2008 as the highest ever recorded.

Based on energy consumption statistics from various nations, scientists estimate that global CO2 emissions dropped by 1.3 percent in 2009. But that still made them the second highest ever—just behind 2008.

The biggest drops were recorded in Japan, the UK and Russia while China, India and South Korea continued to emit more than ever before.

The good news is emissions are back on track this year to increase by more than 3 percent worldwide—a sure indication of economic recovery. And the amount of CO2 emitted for every unit of economic activity is no longer shrinking as fast as it once was, thanks to a new boom in dirty coal.

via Scientific American 60 Second Earth

The podcast raises the question of whether it is possible to have a strong economy without damaging the environment and, conversely, whether environmental protection must come at the expense of the economy.

In reality, that is the challenge of sustainability: how do companies, nations, and consumers adopt sustainable practices while remaining economically sound?  These are the earliest days of the sustainability movement, and the technologies needed to truly answer this question likely don't exist yet.  The situation is not unlike the emergence of our current system of distributing electricity.  The first electrical distribution systems in the US were based on direct current (DC).

Each voltage level of DC power must be distributed across dedicated power lines.  A modern home would therefore need separate lines for low-voltage (roughly 110 V) uses like lighting and televisions, and separate lines for higher-voltage loads like refrigerators, electric clothes dryers, and stoves.  DC also loses voltage with distance, so the point of power generation must be close to the point of consumption.  AC power, though distributed at a single voltage, can be converted to other voltages through the use of transformers.  Because transformers can both increase and decrease voltage, they make it possible to compensate for voltage loss over transmission distance and to supply multiple voltage levels at the point of consumption through a single connection to the power grid.
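The practical effect of stepping up voltage is easy to see with a little arithmetic: for a fixed amount of delivered power P, the line current is I = P/V, and the resistive line loss is I²R, so raising the transmission voltage cuts losses with the square of the voltage.  The sketch below uses made-up, illustrative numbers (not historical data) to compare losses for delivering the same load over the same line at three voltages:

```python
# Illustrative only: I^2*R line-loss comparison for delivering the same
# power over the same line at different transmission voltages.
# All numbers are assumptions for illustration, not historical data.

def line_loss_watts(power_w, volts, line_resistance_ohms):
    """Resistive loss in the line for a given delivered power and voltage."""
    current = power_w / volts                      # I = P / V
    return current ** 2 * line_resistance_ohms    # P_loss = I^2 * R

POWER = 10_000   # 10 kW delivered to the customer (assumed)
LINE_R = 2.0     # ohms of total line resistance (assumed)

for volts in (110, 2_400, 24_000):
    loss = line_loss_watts(POWER, volts, LINE_R)
    print(f"{volts:>6} V: {loss:,.1f} W lost "
          f"({100 * loss / POWER:.3f}% of delivered power)")
```

At 110 V the line in this example wastes more power than it delivers, which is exactly why early DC plants had to sit close to their customers; at transformer-friendly transmission voltages the same line loses only a fraction of a percent.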

After alternating current (AC) generated at Niagara Falls was shown in 1896 to be able to power industry in the Buffalo area, AC became the standard for power distribution.  However, so much DC equipment had been installed in the early days of power transmission that, despite AC's superiority for long-distance transmission, DC distribution systems persisted in portions of major cities, including New York and Boston, well into the 20th century.  A number of industrial and commercial installations also continued to rely on DC power.

Imagine how different our world would be if the AC-versus-DC battle had played out differently.  It is unlikely that the limitations posed by alternative energy sources today will remain in place over time.  To really begin to tackle the imbalance between the economy and the environment, we will need to focus our efforts on developing practical alternative energy sources that do not result in increased environmental harm.

Comments (3)

Did BP Get VOOM?

As children many of us read Dr. Seuss's The Cat in the Hat Comes Back, where we were introduced to VOOM.  For those who didn't read it or don't remember: after wrecking the bathroom with a pink ring in the tub, the Cat keeps calling on more little cat assistants to help clean up the pink mess.  Each assistant makes matters worse until the whole house, even the snow outside, is covered in pink.  In fact, one might say the pink mess was dispersed all over the place.  Then the last and smallest assistant, Little Cat Z, uses VOOM, and just like that the mess is cleaned up and all of the assistants are blown back under the Cat's hat.

So perhaps you are wondering why I am relating VOOM to the BP Gulf oil spill.  As the world watched over the spring and summer, BP tried one failed attempt after another to stop the flow of oil from its blown well in the Gulf.  I could not help but be reminded of the Cat's assistants spreading the mess.  Then, on July 15, BP got a cap in place and the flow stopped.  The government and the press warned us that this was not the final solution.  However, for all intents and purposes, particularly for keeping a spotlight on BP, it was.

Quietly, and without the media attention you might have expected given the magnitude of the spill, BP successfully completed the permanent closure of the well on September 19.  The flow of oil had already stopped on July 15, and in the intervening weeks between mid July and mid September the attention of the national media, and with it the nation, turned to other things.  There was no more spectacle, no more 24-hour-a-day video of oil steadily pouring out of the damaged well.

However, a looming and hotly debated question since then has been: where did all the oil go?  Scientists and experts are deeply divided on the answer.  The US government contends that the majority of the oil is gone, providing the following estimates:

  • 33% of the oil was recovered, burned or dispersed
  • 25% of the oil naturally evaporated
  • 16% of the oil broke down naturally
  • 26% of the oil is just below the surface or washed up on shore

[Figure 1: BP Deepwater Horizon Oil Budget (deepwater_horizon_oil_budget.png)]

Figure 1 is an excerpt from the NOAA report “BP Deepwater Horizon Oil Budget: What Happened To the Oil?”, where additional details can be found.
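To put those percentages in perspective, they can be applied to an assumed total release to get rough per-category volumes.  This is purely illustrative arithmetic on my part, using the roughly 4.4 million barrel estimate discussed at the end of this post and the standard 42 gallons per barrel; it is not a calculation from the NOAA report.

```python
# Illustrative arithmetic only: apply the oil-budget percentages above to
# an assumed total release. The 4.4-million-barrel figure is the estimate
# discussed at the end of this post, not a number from the NOAA report.

GALLONS_PER_BARREL = 42
total_barrels = 4_400_000  # assumed total release

budget = {
    "recovered, burned or dispersed": 0.33,
    "naturally evaporated":           0.25,
    "broke down naturally":           0.16,
    "below the surface or on shore":  0.26,
}

for category, fraction in budget.items():
    barrels = total_barrels * fraction
    print(f"{category:<32} {barrels:>12,.0f} bbl "
          f"({barrels * GALLONS_PER_BARREL:,.0f} gal)")
```

Even the smallest category in the budget works out to hundreds of thousands of barrels, which is why "dispersed" is not a satisfying answer.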

Yet to say the oil is dispersed, whether naturally or through the use of dispersants, doesn't really answer the question of where it went.

Some point to rapid degradation by natural oil-consuming microorganisms.  It makes a kind of logical sense that a region regularly subject to natural oil seepage would harbor an abundance of organisms capable of breaking down oil.  Indeed, one of the fundamental principles of bioremediation is to enhance the environment to support the activity of naturally occurring microorganisms, rather than trying to introduce tailored organisms to consume contaminants.  However, the natural environment in the Gulf was not enhanced.

Concern was raised that, combined with the expected seasonal increase in biological activity in the Gulf, increased activity of petroleum-consuming microbes would produce a large area of oxygen-depleted water.  When measurements were taken, the expected zone of depleted oxygen was not found.  So what does that mean?  Some postulate that the oxygen depletion is insufficient to indicate the level of biological activity that would accompany the breakdown of the amount of oil spilled in the Gulf.  Others speculate that because of the dispersants the biological activity was less concentrated, and therefore still occurred but without depleting the oxygen.

In August, researchers from the University of South Florida found significant deposits of crude oil along the seabed.  They also discovered that microorganisms in the area had been affected by the oil.

Subsequently, in September, researchers from the University of Georgia also found areas of seabed heavily impacted by oil (read the AP press release "Where's the Oil? On the Gulf Floor Scientists Say").  A third research team, from Texas A&M, reported finding oil from the Macondo well 3,000 feet below the ocean surface and 300 miles from the well.  These findings raised further skepticism about the government's claims regarding the fate of the released oil.  Many believe a much larger quantity of oil than estimated sank, as a result of the depth of the initial release and the heavy use of dispersants.  There is concern that oil on the seabed has the potential to affect many species via the food chain, since many of the lower-level species in the food chain originate on the seabed.  Plans are underway to study this further.

Many have pointed to lessons from the Exxon Valdez release.  One of the main points is that, some 20 years later, residual oil can still easily be found on the shores around Prince William Sound.  In most instances you have only to overturn rocks or dig a few inches into the sand and sediment to uncover it; in some places the weathered oil is plainly visible without digging at all.  A second point is that some of the significant species impacts were not seen for up to three years after the spill: determining species damage may require observing more than one breeding cycle.  The Pacific herring disappeared from Prince William Sound three years after the Valdez spill.  The evidence is not conclusive that the disappearance is related to the spill, but most have concluded that it is.  Certainly, conditions in Alaska were, and are, different from those in the Gulf.  The Valdez release was a surface spill into a sheltered body of water, and the normally cold Alaskan weather is not conducive to the type of natural pollutant degradation expected in the Gulf.

What does seem to be the case, at this juncture, is that no one is certain where the oil from the Gulf spill went.  In fact, there isn't even full agreement on the quantity of oil released, though there is emerging consensus that at least 4.4 million barrels (roughly 185 million gallons) were released.  The area, both the environment and the ecology, will need to be studied for years to come to really assess where the oil went and what level of natural resource damage the spill caused.  I suspect that at the conclusion of those studies BP will not actually have found VOOM.

Comments (10)

EPA Adds TSCA Info in EnviroFacts DB

In March of this year, EPA committed to giving the public wider access to data collected under the Toxic Substances Control Act (TSCA).  This action was consistent with EPA Administrator Lisa P. Jackson's commitment to increasing the public's access to chemical information.  According to Steve Owens, assistant administrator for EPA's Office of Prevention, Pesticides and Toxic Substances, "Increasing the public's access to information on chemicals is one of Administrator Jackson's top priorities.  The American people are entitled to easily accessible information on chemicals."

Prior to March 2010, the consolidated public portion of the TSCA Inventory was available only for a fee.  In March the public was given free access to the data on EPA's website as well as on Data.gov, which is part of the Obama administration's ongoing effort to provide more public access to data collected and maintained by the federal government.

On May 17, 2010, EPA announced that it would also add information on 6,300 chemicals and 3,800 chemical facilities to its Envirofacts database and website.  Historic information will be added for another 2,500 facilities.  Envirofacts is EPA's single point of access on the Internet for information about environmental activities that may affect air, water, and land in the U.S., and it provides tools for analyzing the data.  It includes facility name and address information, an aerial image of the facility and surrounding area, the map location of the facility, and links to other EPA information on the facility, such as the inspection and compliance reports available through the Enforcement and Compliance History Online (ECHO) database.
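For readers who would rather pull this data programmatically than browse it, Envirofacts also exposes a RESTful data service.  The sketch below is illustrative only: the table name (tri_facility), column name (state_abbr), and URL pattern are assumptions based on the service's general conventions, so consult the current Envirofacts data-service documentation before relying on them.

```python
# A minimal sketch of querying EPA's Envirofacts RESTful data service.
# The table name, column name, and URL pattern below are assumptions for
# illustration; check the Envirofacts documentation for the actual model.
import requests

BASE = "https://enviro.epa.gov/enviro/efservice"

# Hypothetical query: the first ten TRI facilities in Massachusetts, as JSON.
url = f"{BASE}/tri_facility/state_abbr/MA/rows/0:9/JSON"

response = requests.get(url, timeout=30)
response.raise_for_status()

# Each record comes back as a dictionary of column/value pairs.
for facility in response.json():
    print(facility)
```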

For more information see

US EPA News Release - EPA Makes Chemical Information More Accessible to Public: For the first time, TSCA chemical inventory free of charge online

US EPA News Release - EPA Adds More Than 6,300 Chemicals and 3,800 Chemical Facilities to Public Database: Unprecedented access provided for the first time

Comments (10)

Six Months Later and Fallout From Climategate Continues

Both sides in the controversy over whether global warming is real, and, if real, whether it can be attributed to anthropogenic activity, continue to defend their positions in light of November's "climategate."  To recap: in November there was an unauthorized release of computer documents and emails from the Climatic Research Unit (CRU) of the University of East Anglia.  The majority of the hacked (stolen) material was email, and most of it consisted of fairly routine and mundane exchanges of information and questions among climate scientists.  Those involved were all prominent British and American climate scientists.

The documents that weren't routine exchanges, however, included some that were dismissive and/or demeaning of climate change detractors, and some that discussed ways to present data so that gaps and contradictory data would be masked.  The most troubling were correspondence in which the scientists discussed, and tried to come up with an explanation for, an apparent failure of the climate change models: in the short term, temperatures have decreased slightly rather than increased.

Additionally, the University admitted that during the 1980s much of the actual raw data, stored on paper and magnetic tape, was dumped when the CRU was moved from one location to another.

An inquiry into the actions of the British scientists has cleared them of wrongdoing.  That the inquiry was initiated by the University has caused some to question whether the findings are truly impartial.

Scientists not directly implicated in the controversy have recently declared "enough is enough."  They are quick to point out that other data exists to support the claim that anthropogenic global warming is taking place, data neither related to nor based on the data from East Anglia.

An essay written in response to the continuing controversy, published in the May 7 edition of the journal Science, was signed by 255 members of the U.S. National Academy of Sciences, including 11 Nobel laureates.  The essay states very plainly the signatories' continued conviction that global warming is real, a significant threat to the planet, and the direct result of human activity.  It further chastises politicians who have used the recent developments to further their political causes and bolster their stand against climate change legislation.  The full text of the essay can be found here.  As several have noted, it is virtually impossible to get 255 members of the US National Academy of Sciences to agree on anything.

However, climate change detractors are quick to point out that they have a number of unfulfilled requests for data, records, and information filed under the Freedom of Information Act (FOIA).  Some of the requests, made to NASA, are over two years old.  The statute requires that the material be provided within 20 days of the request.  NASA has responded that the volume of data is overwhelming and that it is doing its best to gather and supply it.  What records it has turned over have been heavily redacted.

So, after months, the issue of the data supporting climate change remains somewhat murky.  What is evident is that the scientists researching climate change need to be more transparent, and not just in response to the increasing volume of their critics.  The lack of transparency raises suspicion that "they" are hiding something, and it makes it even more difficult to pass the very legislation they themselves advocate.  If there are gaps in the data and other anomalies, then theories about what they mean should be offered.  Mistakes, if they have been made, should be openly admitted.  This discourse, however unpleasant for the scientists, is part of the job.  Further discussion of the need for transparency can be found here.


Comments (4)

Historic Tradition, New Technology and E-Waste Come Together in 2010 Winter Olympics

In a year of several firsts for Olympic medals, the Royal Canadian Mint brought together the traditions of Native Peoples, modern technology, and e-waste recycling to create a set of unique medals for the 2010 Winter Olympics and Paralympics.

In honor of the Four Host First Nations, on whose territory the Games are being held, the athlete medals were struck with compelling symbols of West Coast native art.  The Vancouver 2010 medals are based on two large master artworks, an orca whale (Olympic) and a raven (Paralympic), by Corrine Hunt, a Canadian designer/artist of Komoyue and Tlingit heritage based in Vancouver, BC.

The design on each medal represents a section hand-cropped from the original artwork.  In total, 615 unique medals were created for the Olympics and 399 for the Paralympics.  All the metal for the medals was supplied by Teck Resources: 2.05 kilograms of gold, 1,950 kilograms of silver, and 903 kilograms of copper.  A small percentage of that total was recovered from 6.8 metric tons of circuit boards headed for landfill: 1.52 percent of the gold, 0.122 percent of the silver, and 1.11 percent of the copper came from end-of-life circuit boards.  The boards were shredded, the recoverable material was separated, and the separated material was melted down to recover the valuable metals.
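Taken at face value, those percentages imply only a small absolute mass of recycled metal.  A quick back-of-the-envelope check, using only the figures quoted above:

```python
# Quick check of the e-waste contribution implied by the figures above.
metals = {
    # metal: (total supplied in kg, fraction recovered from circuit boards)
    "gold":   (2.05,   0.0152),
    "silver": (1950.0, 0.00122),
    "copper": (903.0,  0.0111),
}

for metal, (total_kg, fraction) in metals.items():
    recovered_kg = total_kg * fraction
    print(f"{metal:<6}: {recovered_kg:.3f} kg of {total_kg:,.2f} kg "
          f"({fraction:.3%}) came from recycled circuit boards")
```

The e-waste contribution works out to roughly 31 grams of gold, 2.4 kilograms of silver, and 10 kilograms of copper, a small fraction of the whole, but a symbolic first for Olympic medals.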

The undulating surface of the medals, intended to evoke the sea, snow, and natural environment of British Columbia, was also a first.  Producing the surface required 12 separate computer-cut and milled dies.  The Royal Canadian Mint has a website dedicated to telling the story of the medals' creation, with more about the manufacturing processes and the artisans involved.

Problems stemming from e-waste are explored further in this paper: Could Your Recycling Program Lead to Charges of Greenwashing or Worse — Hidden Markets in Waste Recycling

Comments (9)