Are You Drowning in Environmental Data While Starving for Information?

By Norman S. Wei, Courtesy of Environmental Management and Training, LLC.

MOUNTAIN VIEW, Calif., 1 January 2020 — A recent article in the Harvard Business Review entitled “IT Doesn’t Matter” generated a great deal of discussion and an enormous amount of controversy. IT stands for Information Technology. The premise of the article is that information technology is becoming less relevant as a strategic management tool because it has become more accessible and affordable to all. The author argues that since every corporation is deep into IT, no one company can gain a significant strategic advantage by embracing it. In other words, IT has become a commodity. It is now a necessary but insufficient condition for excellence.

A natural outcome of the Information Technology age is the ease of collecting massive amounts of data. As a result, we often find ourselves drowning in data while starving for information! This column discusses some practical ways of managing your environmental data and how best to get the most information out of it.

Collecting data is relatively easy to do, but it does come with costs. Mistakes in data collection can lead to disastrous results. A classic example of data collection run amok is Total Quality Management (TQM) – an excellent management concept of “doing it right the first time”. In the 1990s, TQM was all the rage. Large and small corporations were embracing it. Unfortunately, many TQM administrators began to demand that their employees fill out forms to document and justify every decision as part of the TQM process. It got to the point where employees were spending so much time filling out forms and preparing internal reports that the system collapsed under the weight of its own paper. People were collecting data for the sake of collecting data, not analyzing it to improve management practices. They were drowning in data but starving for information. Once senior management discovered this waste of resources, the TQM programs were shut down.

Getting Information out of Data

About ten years ago, the consulting firm I worked for won a multi-million dollar contract to work on a large Superfund site in California. The previous consulting firm had installed about 100 monitoring wells on site and was collecting massive amounts of groundwater quality data. There were reams of computer printouts showing waste solvent concentrations at varying depths at each well over an extended time period. Yet no one had ever bothered to sit down and analyze the data as they were being collected to determine their efficacy. The data kept rolling in from the field. The Superfund site was drowning in data but starving for information. Our firm was hired to make sense of the data – to interpret the groundwater contamination data and determine in which direction the TCE was migrating in the aquifer. Our job was to provide our clients with useful information so that they could negotiate a settlement with EPA on how best to remediate the site. It later became clear why the previous firm had kept on collecting data: it was much more profitable (translation: more billable hours) to have a large team of technicians out in the field generating data than to have a small team of scientists analyzing them.

There is certainly no shortage of data in the environmental field. Every environmental permit requires the permit holder to collect one form of data or another. If you have a wastewater discharge permit, chances are that you are required by law to collect daily flow data, weekly effluent concentrations and calculate monthly averages of various chemical constituents in your waste stream. If you generate hazardous wastes, you are required by law to record how much of which wastes you ship out to a Treatment, Storage and Disposal Facility (TSDF). If you have a major air permit, the government requires you to collect data on how many tons of HAPs (hazardous air pollutants) you send up the stack. By July 1 of each year, many facilities are required by federal law to tell EPA how much of certain chemicals they used, processed or manufactured in the preceding year and how much of those chemicals were “released” to the environment. This is the so-called Toxics Release Inventory report (better known as Form R).

The questions facing us are: How can we make the best use of the data we are required by law to collect? How can we benefit from such data? What useful information can we get out of these data?

Let’s start with the TRI database, often misused (by environmental zealots) and misunderstood (by the media). The great majority of the data contained in the TRI database is based on legally permitted emissions and/or recycling activities. Yet they are all generically termed “releases to the environment”. That’s why every year the public reads about XYZ company being the “largest” polluter in the county because it “releases” so many tons of toxic chemicals to the community. Setting this injustice aside, you should review the raw data that you use to compile your TRI report and determine how effectively you have been recycling your wastes offsite and how well you have controlled your air emissions. Use your own TRI data as a basis for your internal audit or review of your operations. If you are planning on acquiring a company, the EPA’s TRI database (available at www.epa.gov) is a good place to start your due diligence.
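As an illustration, here is a minimal sketch of such an internal TRI review in Python. The file and column names (chemical, total_waste_lb, recycled_offsite_lb, stack_emissions_lb) are hypothetical stand-ins for whatever your raw Form R worksheets actually contain.

```python
# A minimal sketch of a TRI self-audit, assuming your raw Form R data
# sits in a CSV with hypothetical columns: chemical, total_waste_lb,
# recycled_offsite_lb, stack_emissions_lb.
import pandas as pd

tri = pd.read_csv("form_r_raw.csv")  # hypothetical file name

# What fraction of each chemical's total waste was recycled offsite,
# and what fraction went up the stack?
tri["recycled_pct"] = 100 * tri["recycled_offsite_lb"] / tri["total_waste_lb"]
tri["stack_pct"] = 100 * tri["stack_emissions_lb"] / tri["total_waste_lb"]

# Chemicals whose "releases" are mostly legally permitted recycling
print(tri.sort_values("recycled_pct", ascending=False)
         [["chemical", "recycled_pct", "stack_pct"]])
```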

The EPA also compiles a large database of hazardous waste activities nationwide, based on the hazardous waste generation reports that all large quantity generators submit every even-numbered year. These biennial reports – downloadable from the EPA’s website – contain a wealth of information. The latest available database is for year 1999 and can be downloaded from www.epa.gov. It can tell you who has been shipping what type of hazardous wastes to which TSDF. Once you have downloaded the data, you can import it into a relational database program such as Microsoft Access and analyze it. One of the main reasons to know who ships what wastes to a site you are considering is to avoid shipping wastes to a site that receives most of its wastes from small companies. If the site turns into a Superfund site and those small companies go out of business, your company will be forced to bear their share of the cleanup cost under the “joint and several liability” clause of CERCLA. Joint and several liability means that each PRP (Potentially Responsible Party) associated with a Superfund site can be held individually responsible for the entire cleanup cost of the site. That is why you should choose a site with many financially viable PRPs in order to minimize your potential liability.
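To make this concrete, here is the kind of query you might run once the biennial report data are loaded into a relational database. SQLite stands in for Microsoft Access below, and the shipments table and its columns are hypothetical:

```python
# A sketch of the analysis described above, with SQLite standing in for
# Microsoft Access. The shipments table and its columns are hypothetical.
import sqlite3

con = sqlite3.connect("biennial_report.db")  # hypothetical database file
site = "TSDF-001"  # the disposal site you are evaluating

# What share of this TSDF's incoming waste comes from each generator?
# A site dominated by small generators is the riskier choice: if it
# becomes a Superfund site and they fold, the surviving PRPs pay.
rows = con.execute("""
    SELECT generator_name,
           SUM(tons_shipped) AS tons,
           100.0 * SUM(tons_shipped) /
               (SELECT SUM(tons_shipped) FROM shipments WHERE tsdf_id = ?)
               AS pct_of_site_total
      FROM shipments
     WHERE tsdf_id = ?
     GROUP BY generator_name
     ORDER BY tons DESC
""", (site, site)).fetchall()

for name, tons, pct in rows:
    print(f"{name}: {tons:,.0f} tons ({pct:.1f}% of site total)")
```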

Use the air emission data you collect under your air permit to tell you how much VOCs (Volatile Organic Compounds) and HAPs (Hazardous Air Pollutants) are being emitted. Here are some practical ways you can make use of them. Use the information to see how much reformulation of your solvents is needed in order to stay below the thresholds of NESHAP (National Emission Standards for Hazardous Air Pollutants). If you emit more than 10 tons per year of a single HAP or 25 tons per year of any combination of HAPs, you fall under NESHAP, which requires your facility to meet MACT (Maximum Achievable Control Technology) – another set of stringent emission standards under the Clean Air Act. After reviewing their environmental data, some companies have reformulated their paints to reduce their HAPs and have been able to opt out of the stringent requirements of the Title V and NESHAP programs.
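The threshold arithmetic itself is simple enough to sketch; the emission figures below are hypothetical:

```python
# The NESHAP major-source test cited above: more than 10 tons/year of
# any single HAP, or 25 tons/year of all HAPs combined. Emission
# figures here are hypothetical.
haps_tons_per_year = {"toluene": 6.2, "xylene": 4.1, "methanol": 3.5}

single_max = max(haps_tons_per_year.values())
combined = sum(haps_tons_per_year.values())

if single_max > 10 or combined > 25:
    print("Major source of HAPs: MACT standards apply")
else:
    print(f"Below thresholds (largest single HAP: {single_max} tpy, "
          f"combined: {combined} tpy); reformulation may keep it that way")
```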

Look at the wastewater data you have collected under your wastewater discharge permit. Develop a time chart and look for trends or irregularities. Put your data on a spreadsheet and develop a historical correlation between waste loadings and production level. Very often a significant deviation in the temporal (time-related) data trend or a sudden change in the production/waste correlation ratio will point to a malfunction somewhere upstream in the production process. It may indicate a significant leak within your collection system or some wastage of raw material. Since you are legally required to collect such data (daily flow, daily concentrations, etc.), you might as well make the most of it. It could end up helping you improve your source reduction or waste minimization programs and save you some money.
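As a sketch of that trend check, assuming a CSV of daily loadings and production figures with hypothetical column names:

```python
# A sketch of the trend check described above. Column names in the
# daily monitoring file are hypothetical.
import pandas as pd

df = pd.read_csv("wastewater_daily.csv", parse_dates=["date"])

# Ratio of waste loading to production; historically this should be stable.
df["ratio"] = df["bod_loading_lb"] / df["production_units"]

# Flag days where the ratio strays more than 3 standard deviations from
# its 90-day rolling mean: a possible upstream leak or raw material wastage.
rolling = df["ratio"].rolling(90)
df["outlier"] = (df["ratio"] - rolling.mean()).abs() > 3 * rolling.std()

print(df.loc[df["outlier"], ["date", "ratio"]])
```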

 

Data Management

A few words here about data management. Collecting data is a relatively simple task. Figuring out what to do with the data is a different matter. The key to getting useful information out of your data is good data management. And the key to good data management is to make sure you have ownership of your own data. You can’t analyze what you don’t have available to you. Do not hand over your data to an outside firm to “manage” it for you. Here is a nightmare scenario that is all too common: You pay an outside firm to collect environmental data for you. The firm puts your data on its proprietary data system and holds on to it. Every time you need access to your data, you have to pay the firm again to retrieve it. Even worse, when you decide to switch firms, you have to pay the existing firm to download your data and the new firm to upload it.

According to Marian Carr, project manager at Locus Technologies – a California-based firm that designs environmental data management systems for its clients to operate themselves ( www.locustec.com ) – “there is a very strong desire from both public and private firms to consolidate environmental data under the owner’s control”. Many of her clients have “horror stories” of having to pay consultants to get back their own data or not being able to get the data at all.

If you feel a need to customize your computer database, make sure someone within your organization knows how to run it after it has been customized. If an outside consulting firm is collecting environmental field data for you, insist that the data be stored in a format that is compatible with your own system. And insist on getting the data transferred to your system. This is the only way to keep from being held hostage by your consultants. The last thing you want is to be totally dependent on some outside contractor to tell you what data and information you have on a day-to-day basis. It can get very expensive – dollar-wise and knowledge-wise.

 

About the author
Norman S. Wei is the founder and principal of Environmental Management and Training, LLC., a consulting and training firm based in Union, Washington. He offers regulatory seminars and consulting services throughout the country. He can be reached by email at norman@proactenv.com. His company website is www.proactenv.com.

Locus Team Awarded U.S. Navy Approval for Perchlorate and Petroleum Cleanups

SAN FRANCISCO, Calif., 1 September 2004 — Locus Technologies, a leader in environmental remedial solutions, remedial automation, and environmental information technology, and its teaming partner, Tierra Technologies, today announced that the US Navy has accepted the Closed-Loop Bioreactor Technology under the Naval Facilities Engineering Service Center (NFESC) Environmental Broad Agency Announcement (BAA) program for use on perchlorate clean-ups.

The Closed-Loop Bioreactor Technology was previously accepted by the Navy NFESC program for use on petroleum sites. The Locus and Tierra team has gone through extensive bench testing and evaluation of field data to demonstrate the effectiveness of the Closed-Loop Bioreactor Technology for degradation of perchlorate source areas and soils.

The Closed-Loop Bioreactor Technology is unique in that it affords clients a rapid and effective remedial alternative to address perchlorate source and soil contamination without the risk of flushing perchlorate to groundwater. The Closed-Loop Bioreactor Technology is also highly proven in petroleum clean-up applications, with recovery of phase-separated product within 30 to 90 days and the attainment of aquifer quality standards typically within 9 to 12 months.

“We are pleased to expand our support to the Navy and other DOD entities with the availability of this innovative and rapid cleanup technology for perchlorate and petroleum problems,” said Mark Bittner, Director of Locus Technologies’ Sacramento and Arizona Regions.

Locus Technologies is involved with several high-profile perchlorate investigation and treatment projects and has established a reputation as an industry leader in emerging contaminants, such as perchlorate and NDMA. Locus often teams with quality firms, such as Tierra Technologies, to provide its clients with innovative solutions and an unbiased approach to technology selection.

Locus awarded multi-phase Superfund project with ADEQ

SAN FRANCISCO, Calif., 10 May 2004  — Locus Technologies has been awarded a multi-phase project to support the Arizona Department of Environmental Quality (ADEQ) at the West Central Phoenix, North Canal State Superfund site. The project, which is valued at nearly $300,000, will allow Locus to provide consulting support to the ADEQ for both a Remedial Investigation/Feasibility Study (RI/FS) and Early Response Action (ERA) services.

Locus is pleased to build on its established and in-depth experience at Superfund sites in the west and its extensive resume with chlorinated hydrocarbon investigation and remediation programs.

“Locus staff in Phoenix have extensive prior experience with ADEQ WQARF projects. That experience, coupled with Locus’ outstanding reputation at EPA Region 9 sites, makes for a powerful combination in supporting the ADEQ with this valued public service project,” says Mark Bittner, Locus Regional Director and WCP Project Manager.

Neno Duplancic, President and CEO of Locus, adds “Locus is also providing ADEQ with database management services as an integral part of our RI/FS program. We are excited to provide ADEQ with hands-on exposure to our Environmental Information Management (EIM™) web-based data management software. EIM will allow ADEQ to easily track current and historic groundwater data trends, increasing department efficiency and saving costs.”

Locus to help Companies Comply with Sarbanes-Oxley Act of 2002 for Environmental Liability Management

SAN FRANCISCO, Calif., 13 April 2004 — Locus Technologies (Locus), a leading provider of web-based environmental information management systems, today announced that the new release of its LocusFocus portal, scheduled for summer 2004, will help public companies comply with the corporate governance requirements of the Sarbanes-Oxley Act, which Congress passed into law in 2002 in response to a series of corporate financial crises. LocusFocus already addresses many aspects of managing a company’s environmental liability, such as analytical data management, auditing, and document management. Additional functionality will include Environmental, Health, and Safety (EH&S) reporting and financial management of environmental liability, which companies are developing in response to governance requirements. As part of balance sheet reporting, publicly traded companies must report their environmental liabilities and reserves.

The Sarbanes-Oxley Act mandates more rigorous corporate governance practices for all aspects of a company’s business including recognition, measurement, display and disclosure of environmental liabilities. Given the increased emphasis on corporate accountability and the penalties and personal liability to CEOs and CFOs for non-compliance, companies are evaluating ways to increase the accuracy of assessing and quantifying environmental liabilities. Locus’s new updated portal is designed to meet this growing need and to provide a tool to help companies organize, manage, and document their environmental liabilities.

“In order to minimize the possibility of erroneous or misleading disclosure, companies increasingly rely on consultants to assist in evaluating internal controls and disclosure procedures, conduct due diligence, analyze and document environmental liabilities, and review existing environmental liability disclosures for compliance with applicable securities laws. Now the companies, their consultants, and legal staff have a tool to document and manage all aspects of environmental liability in a way that was not possible before. By keeping all information about contaminated sites in a single, centralized, secure, web-based system, companies can aggregate information in real time, check the cleanup status of every site, monitor the financial performance of consultants and contractors, and most importantly have real-time corporate environmental reserve and liability information at their fingertips,” said Neno Duplancic, President and CEO of Locus.

Companies that subscribe to use LocusFocus will make better use of resources and find it easier to comply with US and international environmental requirements, while at the same time lowering their operating costs associated with environmental information management.

By providing a systematic structure for planning, internal auditing and reviewing environmental information, LocusFocus enables companies to meet and exceed environmental requirements as well as enhance their credibility with customers, stakeholders and the public.

Locus announces completion of US EPA Region 5 environmental protection reporting requirements

SAN FRANCISCO, Calif., 5 February 2004 — Locus Technologies (Locus), a leader in environmental information management, announced today that it has expanded its award-winning, web-based Environmental Information Management™ (EIM™) system to include the capability of exporting data in compliance with the US EPA Region V Geographic Information System and Field Environmental Decision Support system.

The Region V FIELDS software forms the foundation for an EPA system that provides data analysis and interpretation for environmental decision-making. The results allow EPA project managers to evaluate the extent of contamination and hot spot sizes, estimate health risks, prioritize site goals, and weigh potential actions. Users include US EPA Regions, National Oceanic and Atmospheric Administration’s coastal restoration scientists, state and tribal agencies, as well as the private and academic community.

EIM™’s compatibility with Region V’s requirements will open a whole region to the EIM™ data management system. Now, companies and agencies with projects located in Illinois, Indiana, Michigan, Minnesota, Ohio, Wisconsin, and the 35 Tribal Nations in those areas can feel confident when selecting EIM™ to manage their environmental data.

“We are very pleased that EIM™ now provides export ability consistent with US EPA Region V requirements. By bringing EIM™ technology to its customers in the upper Midwest, Locus has provided the first web-based tool to upload and transmit vast amounts of sampling data to EPA Region V from a centralized web system. The EIM™ system links laboratories, clients, and their consultants to EPA Region V through a seamless web-based interface. By leveraging Web Services and XML technologies, Locus continues to provide its customers with a cost-competitive, centralized analytical information management system that is superior to any client-server system available in the marketplace today,” said Dr. Neno Duplancic, President and CEO of Locus Technologies.

“As our client base continues to grow throughout the nation, Locus is committed to meeting all federal and state electronic data deliverables for the environmental industry, including the XML-based, federal SEDD, once it has been approved,” added Dr. Duplancic.

Chevron Environmental Management Company selects Locus’ web portal for environmental laboratory data management

SAN FRANCISCO, 30 September 2003 — Locus Technologies (Locus), a global leader in environmental information management, today announced that Chevron Environmental Management Company, a subsidiary of ChevronTexaco Corporation, San Ramon, California, has selected Locus’s web-based Environmental Information Management System™ (EIM™) for management of Chevron’s Environmental Laboratory Data at their environmental remediation projects. EIM™ is a module of Locus’s award-winning LocusFocus(SM) web portal for environmental information management.

As a part of an effort to consolidate analytical data and provide uniform environmental database management practices across its organization, Chevron Environmental Management Company decided to utilize a web-based system and will start using LocusFocus(SM) immediately.

Using a web-based system to organize and manage large amounts of environmental information, EIM™ provides users real-time access to crucial information that heretofore had been stored in distributed systems accessible to only a few. The development and deployment of such web-based databases requires deep content knowledge and years of experience developing applications for the environmental industry. Locus’s core team has more than 60 years of combined experience in this area and has worked with clients ranging from numerous Fortune 500 companies to the Department of Energy and the US military. Locus believes its dual expertise in content knowledge and computer applications has enabled it to develop the best tool available today for managing environmental information. LocusFocus(SM) is built on a robust infrastructure that leverages the latest web technologies, such as XML and Web Services, and utilizes advanced security and backup devices and tools to protect each client’s data.

“Clearly, the bet we are placing on web-based environmental software is a big one. And part of what makes it big is that it encompasses a solution to a problem that could not have been delivered to the environmental industry without the web. Environmental projects generate huge amounts of data that need to be analyzed and put to beneficial use. Complex and expensive decisions hinge on the accessibility, quality, and ease of use of this data, all of which better software tools can improve. Driving the content and the technology solution at the same time requires a lot of work, but it is necessary if one aspires to lead this industry. We are happy to have Chevron Environmental Management Company join the family of Locus clients,” said Dr. Neno Duplancic, president and CEO of Locus Technologies.

“Many resources are trapped in inefficient environmental information management, including the excessive man-hours spent loading laboratory deliverables, searching for information, and producing reports. Companies are now pragmatically adopting these new technologies to improve the bottom line of their environmental projects,” added Duplancic.

Locus Announces Completion of New Jersey Department of Environmental Protection Reporting Requirements

SAN FRANCISCO, Calif., 23 July 2003 — Locus Technologies (Locus), a leader in environmental information management, today announced that it has expanded its award-winning, web-based Environmental Information Management™ (EIM™) system to include the capability of importing and exporting data in compliance with the New Jersey Department of Environmental Protection (NJDEP) Site Remediation Program (SRP).

As part of New Jersey’s participation in the National Environmental Performance Partnership System (NEPPS), the SRP is developing groundwater indicators to show progress in groundwater contamination cleanup. The SRP has issued regulations requiring analytical results of sampling data to be submitted electronically and in a GIS-compatible format. Using this information, the SRP plans to delineate changes in the areal extent of each groundwater plume to evaluate environmental progress in cleaning up contaminated sites.

Addressing the issue of voluminous hardcopy data submissions, the SRP requires that all sites presently being remediated within New Jersey submit their data in electronic format. Moving away from hardcopy data submission has the potential to accelerate the review and statistical manipulation of information, significantly enhancing NJDEP’s ability to serve the regulated community and protect the environment and the public. The agency is already collecting massive amounts of data; the need to process this information quickly and accurately is therefore a growing concern. With Locus’s incorporation of the SRP standard, EIM™ users now have the tools they need to import and export NJDEP data formats.

“We are very pleased that EIM™ now provides interoperability with NJDEP requirements. By bringing EIM™ technology to its customers in New Jersey, Locus has provided the first web-based tool to upload and transmit vast amounts of sampling data to the state from a centralized web system. The EIM™ system links laboratories, clients, and their consultants to the state through a seamless web-based interface. By leveraging Web Services and XML technologies, Locus continues to provide its customers with a cost-competitive, centralized analytical information management system that is superior to any client-server system available in the marketplace today,” said Mr. Neno Duplancic, President and CEO of Locus Technologies.

“Only three months after announcing California’s Water Resources Control Board AB2886 reporting requirements compatibility, Locus has delivered another important state standard. Locus is committed to meeting all federal and state electronic data deliverables for the environmental industry, including the XML-based, federal SEDD, once it has been approved,” added Dr. Duplancic.

Locus announces AB2886 reporting requirements

SAN FRANCISCO, Calif., 13 March 2003 — Locus Technologies (Locus), a leader in environmental information management, today announced that it has expanded its award-winning, web-based Environmental Information Management™ (EIM™) system, a part of its LocusFocus(SM) Web portal, to include the capability of importing and exporting Electronic Data Deliverables (EDDs) that meet the state of California’s Water Resources Control Board AB2886 reporting requirements.

The import module includes all data verification and consistency checks outlined in the documentation for the program, as well as on-line forms to view location, well, sample, and analytical information in the AB2886 format.

The export module allows the user to generate correctly formatted electronic datasets for any of the AB2886-required files. Both modules are intimately linked to other components of the system, thus allowing users to create reports, build graphs, query selected results, and/or download selected datasets into Microsoft Excel or other third-party packages.
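To give a flavor of what such import-side verification involves, here is a minimal sketch; the required fields, date format, and file name are illustrative, not the actual AB2886 specification:

```python
# A sketch of import-side EDD verification. The required fields, date
# format, and file name are illustrative, not the AB2886 specification.
import csv
import datetime

REQUIRED = ["well_id", "sample_date", "parameter", "result", "units"]

def check_edd(path):
    errors = []
    with open(path, newline="") as f:
        for n, row in enumerate(csv.DictReader(f), start=2):
            missing = [c for c in REQUIRED if not row.get(c)]
            if missing:
                errors.append(f"line {n}: missing {', '.join(missing)}")
                continue
            try:  # consistency checks on individual fields
                datetime.datetime.strptime(row["sample_date"], "%Y-%m-%d")
            except ValueError:
                errors.append(f"line {n}: bad date {row['sample_date']!r}")
            try:
                float(row["result"])
            except ValueError:
                errors.append(f"line {n}: non-numeric result {row['result']!r}")
    return errors

for problem in check_edd("lab_edd.csv"):  # hypothetical lab deliverable
    print(problem)
```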

These modifications to EIM™ reflect Locus’s commitment to building an enterprise system that allows national or multi-national companies to meet their diverse data management needs and reporting requirements across the U.S. and around the world. Other recent enhancements to the system give companies even more flexibility in customizing the requirements for a given facility or site, while still allowing all the company’s data to reside in a single repository.

“There are many different government-derived or commercial formats of electronic data delivery or reporting produced by analytical laboratories in California and nationwide. While Locus intends to match all of these various format requirements from different states and regulatory agencies, the company is also working to pioneer the introduction of extensible markup language (XML)-based EDDs that would facilitate environmental data interchange among various project participants. Today, the advent of Web Services based on XML is making it possible to build, test, and deploy an application of XML-based EDDs that can be used beyond labs and consultants. Because it contains both the specific data required for a transaction or request and the metadata that describes that data, XML can be used to exchange data between different computer systems and different software applications, making EDDs more usable. Best of all, XML doesn’t have to ‘understand’ the underlying software running on the other computer. Locus is raising the issue of XML for EDDs owing to their greater flexibility and increasing use across a variety of fields and industries,” said Mr. Neno Duplancic, president and CEO of Locus.
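A toy example illustrates the self-describing quality Duplancic refers to; the element names below are hypothetical, not an actual EDD schema:

```python
# A toy XML-based EDD record: the data travel together with the
# metadata that describes them. Element names are hypothetical.
import xml.etree.ElementTree as ET

edd = ET.Element("edd", attrib={"format_version": "1.0"})
rec = ET.SubElement(edd, "result")
ET.SubElement(rec, "well_id").text = "MW-01"
ET.SubElement(rec, "parameter", attrib={"cas": "79-01-6"}).text = "TCE"
ET.SubElement(rec, "value", attrib={"units": "ug/L"}).text = "5.2"
ET.SubElement(rec, "sample_date").text = "2003-03-01"

# Because the markup is self-describing, the receiving system needs no
# knowledge of the software that produced it in order to parse it.
print(ET.tostring(edd, encoding="unicode"))
```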

Interview with Locus Technologies President Neno Duplan, KNBR Radio

Gary Allen on Silicon Valley’s KNBR Business radio interviews Dr. Neno Duplan from Locus Technologies on the environmental challenges faced by tech companies and manufacturers in Silicon Valley.

 

The Internet and Environmental and Geotechnical Data

Geo-Strata

1 January 2003 — “Data, data everywhere and not a drop to use.” Samuel Taylor Coleridge’s original verse was actually about water, but the result is the same for today’s environmental and geotechnical engineers and site owners as it was for the poet’s ancient mariner: drowning in a sea of information that is as unusable as salt water is for drinking.

Investigations, cleanups, and post-closure monitoring and maintenance of contaminated waste sites can generate enormous amounts of data. At large, complex sites, it is not uncommon to drill hundreds of boreholes and wells, collect tens of thousands of samples, and then analyze each of these for several hundred contaminants to ascertain the nature and extent of contamination and geotechnical properties. The information from these various phases, which may eventually include a million or more sampling and analytical records, is typically entered into a database or, worse, into a spreadsheet. With so much data to manage, precious resources are squandered on unproductive administrative tasks.

 

What’s usually done?
Most companies with environmental problems do not store their own environmental data. Instead, they rely on their consultants for this service. Larger companies with particularly troublesome or multiple sites are often wary of “putting all their eggs in one basket” and opt instead to apportion their environmental work among multiple consultants.

Rarely do all consultants use the same environmental database management system. And equally rare is the customer who insists on this. The end result is that the company’s environmental data are stored in various stand-alone or client-server systems at different locales.

If another consultant is hired to do some specialized work, such as risk assessment, data must usually be downloaded into files, then uploaded, and after much “massaging,” installed into the new consultant’s system. Often the data in these systems are not readily accessible to the consultant’s engineers and geologists, or to the companies who actually “own” it. Instead, information requests must go through specialists who know how to extract data from the system.

As for all the various documents and reports, these are often stored in a variety of locales and formats. Considerable time can be lost tracking them down and delivering them to the appropriate personnel. When tasks must be approved by multiple individuals, the necessary documents are sometimes passed sequentially from one person to another, resulting in significant and unnecessary delays at high cost to the client.

All in all, it is not uncommon for environmental and related project information to be handled and processed by dozens of people, in different ways, with few standards or quality control practices governing the various steps in the process, and with no central repository.

With so much information to deal with, it should not come as a surprise that many companies find themselves drowning in data but starving for knowledge.

 

What’s out there?
There is no lack of computerized tools in the marketplace to help companies manage and process this information. However, these typically exist and function as islands of technology rather than as parts of an integrated package or system.

Complicating the matter is that some of these individual tools are stand-alone applications that need to be installed on each user’s computer, whereas others are client-server systems that must be accessed over a dedicated network.

Much rarer is an Internet-based solution. Yet many of the problems and inefficiencies described here can be reduced, if not eliminated, by turning to Internet technologies.

 

What about the Internet?
An easy-to-query Internet-based environmental database management system into which all consultants on a project upload their field and analytical data eliminates these incompatibility and accessibility problems. There is no need to transfer data from one party to another, because all interested parties are able to query and, as needed, download information from the same database using their web browsers. Further inefficiencies can be wrung out of the data acquisition and reporting process by turning to hand-held devices and remote control and automation systems to upload field and sampling data more quickly and reliably. The Internet need not be used just to store data on site conditions. It can also serve as the primary repository for the various permits, drawings, reports, and other documents generated during the course of a site investigation or cleanup. Having all this information stored in a single place facilitates communication among all interested parties, improves project coordination, and decreases the overall costs of environmental remediation.
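To make the architecture concrete, here is a bare-bones sketch of such a shared web repository: one endpoint through which any consultant uploads sampling results, and one through which any authorized party queries them. A production system would add authentication, validation, and audit trails; Flask and SQLite are used here purely for illustration.

```python
# A bare-bones sketch of a shared, web-based site database: consultants
# POST results to /upload; any authorized party queries /results/<well>.
# Flask and SQLite are illustrative choices, not a production design.
import sqlite3
from flask import Flask, request, jsonify

app = Flask(__name__)
DB = "site_data.db"

with sqlite3.connect(DB) as con:
    con.execute("""CREATE TABLE IF NOT EXISTS results
                   (well_id TEXT, parameter TEXT, value REAL, sampled TEXT)""")

@app.route("/upload", methods=["POST"])
def upload():
    rec = request.get_json()  # one sampling record per request
    with sqlite3.connect(DB) as con:
        con.execute("INSERT INTO results VALUES (?, ?, ?, ?)",
                    (rec["well_id"], rec["parameter"],
                     rec["value"], rec["sampled"]))
    return jsonify(status="ok")

@app.route("/results/<well_id>")
def results(well_id):
    with sqlite3.connect(DB) as con:
        rows = con.execute("SELECT parameter, value, sampled FROM results "
                           "WHERE well_id = ?", (well_id,)).fetchall()
    return jsonify(rows)

if __name__ == "__main__":
    app.run()
```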

 

What are the obstacles?
Why have most consulting firms made little if any effort to make site-related documents and data accessible over the Web? Explanations for this failure are many, but foremost may be their unwillingness to do anything that would reduce their revenues or their clients’ dependence on them.

Because their clients are far removed from the processes of loading data, running queries, and generating reports, they are in no position to pass judgment on, or recommend improvements in, their consultants’ data management practices. Only on infrequent occasions will a client of a consulting firm (1) encounter or hear about another environmental information management system, and (2) be sufficiently motivated to look into its pros and cons.

This motivation, however, does not translate into expertise in the area. So in the end, the client will typically turn to its consultant(s) for advice and assistance. I need not spell out the inevitable outcome of this process.

 

What about the future?
In the years ahead, the short shrift given to information management practices and techniques will change, particularly as more and more contaminated waste sites, after being cleaned up, enter the operations and maintenance (O&M) phase, or what in some circles has come to be called the long-term stewardship (LTS) phase.

Information management costs, together with those associated with sample collection and analysis and data evaluation and reporting, are expected to consume over half of the annual LTS budget for sites in this phase. Considering that the LTS phase often lasts for decades and that an estimated 300,000 to 400,000 contaminated sites exist in the United States alone, it is clear that both industry and government face substantial “stewardship” costs in the years ahead.

Because most of these charges will be related to information management, activities and expenses in this area will come under increasing scrutiny from those footing the bill. As a result, firms involved in data collection, storage, and reporting at these sites will be forced to evaluate their practices. In so doing they will come to realize, reluctantly or not, the benefits of adopting Internet-based tools and systems.

For the past three years I have been in charge of the development and implementation of the environmental industry’s first integrated, web-based system for managing and storing sampling and analytical data and project documents. The system includes:

  • An environmental information (analytical database) management system
  • Two hand-held applications to record water level readings and compliance data
  • An alternative to traditional GIS based on a new Web graphics format, an XML-based language called Scalable Vector Graphics (see the sketch following this list)
  • Project management tools
  • Automatic emailing and calendar reminders
  • Document storage and retrieval, on-line collaboration opportunities
  • Remote control, automation, and diagnostics of process and treatment systems for water, groundwater, wastewater, air, and soil
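As a toy example of the Scalable Vector Graphics item above, the following sketch renders a handful of made-up well locations as a browser-viewable vector map, with symbol size scaled to concentration:

```python
# A toy SVG "map": made-up well locations drawn as circles whose radius
# scales with concentration, viewable in any web browser.
wells = [("MW-01", 120, 80, 5.2),   # (id, x, y, TCE in ug/L)
         ("MW-02", 240, 150, 48.0),
         ("MW-03", 310, 60, 0.8)]

parts = ['<svg xmlns="http://www.w3.org/2000/svg" width="400" height="220">']
for well_id, x, y, conc in wells:
    r = max(3.0, conc ** 0.5)  # radius scaled to concentration
    parts.append(f'<circle cx="{x}" cy="{y}" r="{r:.1f}" fill="steelblue"/>')
    parts.append(f'<text x="{x + 8}" y="{y + 4}" font-size="10">{well_id}</text>')
parts.append("</svg>")

with open("site_map.svg", "w") as f:
    f.write("\n".join(parts))
```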

I have seen the implementation of remote control and automation technologies and document storage and retrieval tools reduce the monthly costs of monitoring and maintenance at the site of a diesel spill in a remote mountainous area from $10,000 to $1,000, for an investment of only $30,000 – a payback period of just over three months. I have also seen the data acquisition and reporting costs at a large site in the O&M phase decline by over 20% after the system was implemented.

The only individuals unhappy with this decline are those who were previously “forced” to either snowmobile or ski into the site during the winter months when the roads to it were impassable.

By adopting such new monitoring, database, and web technologies, a typical Fortune 100 company with a portfolio of 50 sites, whose net present value of long-term (30-year) monitoring costs is in the $100 million range, could lower these expenditures by $30 million or more.

If these numbers and predictions are correct, industry and government stand to benefit immensely in the years ahead from increased usage of the Internet as the primary repository and vehicle for the storage and delivery of environmental information and documents.