By Norman S. Wei, Courtesy of Environmental Management and Training, LLC.

MOUNTAIN VIEW, Calif., 1 January 2020 — A recent article in the Harvard Business Review entitled “IT Doesn’t Matter” generated a great deal of discussion and an enormous amount of controversy. IT stands for Information Technology. The premise of the article is that information technology is becoming less relevant as a strategic management tool because it has become accessible and affordable to all. The author argues that since every corporation is deep into IT, no single company can gain a significant strategic advantage by embracing it. In other words, IT has become a commodity: a necessary but insufficient condition for excellence.

A natural outcome of the Information Technology age is the ease of collecting massive amounts of data. As a result, we often find ourselves drowning in data while starving for information! This column discusses some practical ways of managing your environmental data and how best to get the most information out of it.

Collecting data is relatively easy to do, but it comes with costs. Mistakes in data collection can lead to disastrous results. A classic example of data collection run amok is Total Quality Management (TQM), an excellent management concept of “doing it right the first time”. In the 1990s, TQM was all the rage, and corporations large and small were embracing it. Unfortunately, many TQM administrators began demanding that their employees fill out forms to document and justify every decision as part of the TQM process. It got to the point where employees were spending so much time filling out forms and preparing internal reports that the system collapsed under its own paper weight. People were collecting data for the sake of collecting data, not analyzing it to improve management practices. They were drowning in data but starving for information. Once senior management discovered this waste of resources, the TQM program was shut down.

Getting Information out of Data

About ten years ago, the consulting firm I worked for won a multi-million dollar contract to work on a large Superfund site in California. The previous consulting firm had installed about 100 monitoring wells on site and was collecting massive amounts of groundwater quality data. There were reams of computer printouts showing waste solvent concentrations at varying depths at each well over an extended time period. Yet no one had ever bothered to sit down and analyze the data as they were being collected to determine their efficacy. The data kept rolling in from the field. The Superfund site was drowning in data but starving for information. Our firm was then hired to make sense of the data: to interpret the groundwater contamination data and determine in which direction the TCE was migrating in the aquifer. Our job was to provide our clients with useful information so that they could negotiate a settlement with EPA on how best to remediate the site. It became clear later why the previous firm kept on collecting data: it was much more profitable (translation: more billable hours) to have a large team of technicians out in the field generating data than a small team of scientists analyzing them.

There is certainly no shortage of data in the environmental field. Every environmental permit requires the permit holder to collect one form of data or another. If you have a wastewater discharge permit, chances are that you are required by law to collect daily flow data and weekly effluent concentrations and to calculate monthly averages of various chemical constituents in your waste stream. If you generate hazardous wastes, you are required by law to record how much of which wastes you ship out to a Treatment, Storage and Disposal Facility (TSDF). If you have a major air permit, the government requires you to collect data on how many tons of HAPs (hazardous air pollutants) you send up the stack. By July 1 of each year, many facilities are required by federal law to tell EPA how much of certain chemicals they used, processed, or manufactured in the preceding year and how much of those chemicals were “released” to the environment. This is the so-called Toxic Release Inventory Report, better known as Form R.

The questions facing us are: How can we make the best use of the data we are required by law to collect? How can we benefit from such data? What useful information can we get out of these data?

Let’s start with the TRI database, often misused (by environmental zealots) and misunderstood (by the media). A great majority of the data contained in the TRI database is based on legally permitted emissions and/or recycling activities. Yet they are all generically termed “releases to the environment”. That’s why every year the public reads about XYZ company being the “largest” polluter in the county because it “releases” so many tons of toxic chemicals to the community. Leaving this injustice aside, you should review the raw data that you use to compile your TRI report and determine how effectively you have been recycling your wastes offsite and how well you have controlled your air emissions. Use your own TRI data as a basis for your internal audit or review of your operations. If you are planning on acquiring a company, the EPA’s TRI data website is a good place to start your due diligence.

The EPA also compiles a large database on hazardous waste activities nationwide, based on the hazardous waste generation reports that all large quantity generators send in every even-numbered year. These biennial reports – downloadable from the EPA’s website – contain a wealth of information.
The latest available database covers reporting year 1999. It can tell you who has been shipping what type of hazardous wastes to which TSDF. Once you have downloaded the data into your computer, you can import it into a relational database program such as Microsoft Access and analyze it. One of the main reasons to know who ships what wastes to a site you are considering is to avoid shipping wastes to a site that receives most of its wastes from small companies. If the site turns into a Superfund site and those small companies go out of business, your company will be forced to bear their share of the cleanup cost under the “joint and several liability” clause of CERCLA. Joint and several liability means each PRP (Potentially Responsible Party) associated with the Superfund site can be held individually responsible for the entire cleanup cost of the site. That is why you need to choose a site that has many financially viable PRPs in order to minimize your potential liability.
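As a rough illustration of that analysis, the sketch below loads a few hypothetical shipment records into an in-memory SQLite database (standing in for a program such as Microsoft Access) and computes, for each TSDF, the share of incoming waste that comes from small generators. The generator names, size categories, and field layout are all made up for the example; the real biennial-report files contain many more fields.

```python
import sqlite3

# Hypothetical, simplified shipment records from a biennial-report extract.
# Generator names, size categories, and field layout are illustrative only.
shipments = [
    ("Acme Plating",   "large", "TSDF-001", 120.0),
    ("Smalltown Shop", "small", "TSDF-001",   3.5),
    ("Corner Garage",  "small", "TSDF-001",   1.2),
    ("Big Chem Co",    "large", "TSDF-002", 450.0),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE shipments (generator TEXT, size TEXT, tsdf TEXT, tons REAL)")
conn.executemany("INSERT INTO shipments VALUES (?, ?, ?, ?)", shipments)

# For each TSDF: what fraction of its incoming waste comes from small generators?
small_share = {}
for tsdf, small_tons, total_tons in conn.execute("""
        SELECT tsdf,
               SUM(CASE WHEN size = 'small' THEN tons ELSE 0 END),
               SUM(tons)
        FROM shipments GROUP BY tsdf ORDER BY tsdf"""):
    small_share[tsdf] = small_tons / total_tons
    print(f"{tsdf}: {small_share[tsdf]:.0%} of {total_tons} tons from small generators")
```

Under the joint-and-several-liability reasoning above, a TSDF whose small-generator share is high is the one to avoid.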

Use the air emission data you collect under your air permit to tell you how much VOCs (Volatile Organic Compounds) and HAPs (Hazardous Air Pollutants) are being emitted. Here are some practical ways you can make use of it. Use the information to see how much reformulation of your solvents you need in order to come in below the thresholds of NESHAP (National Emission Standards for Hazardous Air Pollutants). If you emit more than 10 tons per year of a single HAP or 25 tons per year of any combination of HAPs, you come under NESHAP, which requires your facility to meet MACT (Maximum Achievable Control Technology) – another set of stringent emission standards under the Clean Air Act. After reviewing their environmental data, some companies have reformulated their paints to reduce their HAPs and have been able to opt out of the stringent requirements under the Title V and NESHAP programs.
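The two thresholds just cited are easy to check mechanically once you have annual emission totals per HAP. The sketch below is a minimal illustration of that check; the solvent blends and tonnages are hypothetical.

```python
# Major-source test under the thresholds cited above: more than 10 tons/year
# of any single HAP, or more than 25 tons/year of all HAPs combined.
def is_hap_major_source(emissions_tpy):
    """emissions_tpy: dict mapping HAP name to tons emitted per year."""
    totals = list(emissions_tpy.values())
    return max(totals, default=0.0) > 10.0 or sum(totals) > 25.0

before = {"toluene": 12.0, "xylene": 6.0}  # hypothetical pre-reformulation blend
after  = {"toluene": 7.0,  "xylene": 6.0}  # hypothetical reformulated blend

print(is_hap_major_source(before))  # True: toluene alone exceeds 10 tons/year
print(is_hap_major_source(after))   # False: no single HAP over 10, total under 25
```

Running the same check against a proposed reformulation shows immediately whether it would take the facility out of the major-source category.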

Look at the wastewater data you have collected under your wastewater discharge permit. Develop a time chart and look for trends or irregularities. Put your data on a spreadsheet and develop a historical correlation between waste loadings and production level. Very often a significant deviation in the temporal (time-related) data trend or a sudden change in the production/waste correlation ratio will point to a malfunction somewhere upstream in the production process. It may indicate a significant leak within your collection system or some wastage of raw material. Since you are legally required to collect such data (daily flow, daily concentrations, etc.), you might as well make the most of it. It could end up helping you improve your source reduction or waste minimization programs and save you some money.
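The production/waste correlation check described above can be sketched in a few lines: compute the waste-to-production ratio for each period and flag any period that deviates sharply from the historical baseline. The monthly figures below are made up for illustration.

```python
import statistics

# Hypothetical monthly records: (production units, waste loading in kg).
history = [(100, 50), (110, 56), (95, 47), (105, 52),
           (98, 49), (102, 51), (107, 54), (100, 75)]

ratios = [waste / production for production, waste in history]
baseline = ratios[:-1]                      # all months except the latest
mean = statistics.mean(baseline)
stdev = statistics.stdev(baseline)

latest = ratios[-1]
flagged = abs(latest - mean) > 3 * stdev    # a simple 3-sigma rule of thumb
if flagged:
    print(f"Latest ratio {latest:.2f} vs. baseline {mean:.2f}: "
          "check upstream for leaks or raw-material wastage.")
```

The 3-sigma cutoff is only a rule of thumb; the point is that a stable historical ratio makes any genuine process upset stand out at a glance.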


Data Management

A few words here about data management. Collecting data is a relatively simple task; figuring out what to do with the data is a different matter. The key to getting useful information out of your data is good data management. And the key to good data management is to make sure you have ownership of your own data. You can’t analyze what you don’t have available to you. Do not hand over your data to an outside firm to “manage” for you. Here is a nightmare scenario that is all too common: You pay an outside firm to collect environmental data for you. The firm puts your data on its proprietary data system and holds on to it. Every time you need access to your data, you have to pay the firm again to retrieve it. Even worse, when you decide to switch firms, you have to pay the existing firm to download your data and the new firm to upload it.

According to Marian Carr, project manager at Locus Technologies, a California-based firm that designs environmental data management systems for its clients to operate themselves, “there is a very strong desire from both public and private firms to consolidate environmental data under the owner’s control”. Many of her clients have “horror stories” of having to pay consultants to get back their own data or not being able to get the data at all.

If you feel a need to customize your computer database, make sure someone within your organization knows how to run it after it has been customized. If an outside consulting firm is collecting environmental field data for you, insist that the data be stored in a format that is compatible with your own system. And insist on getting the data transferred to your system. This is the only way to keep from being held hostage by your consultants. The last thing you want is to be totally dependent on some outside contractor to tell you what data and information you have on a day-to-day basis. It can get very expensive – dollar-wise and knowledge-wise.
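One concrete way to keep that ownership is to insist on periodic exports in an open, plain-text format. The sketch below, using a made-up groundwater results table, dumps monitoring data from a database to CSV, a format any system can read; the table name, well IDs, and values are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical groundwater results table; well IDs and values are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE results (well TEXT, sample_date TEXT, tce_ug_per_l REAL)")
conn.executemany("INSERT INTO results VALUES (?, ?, ?)",
                 [("MW-01", "2003-06-01", 12.5),
                  ("MW-02", "2003-06-01", 4.1)])

# Dump the table to CSV so the data stays in an open format you control.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["well", "sample_date", "tce_ug_per_l"])
writer.writerows(conn.execute("SELECT * FROM results ORDER BY well"))
print(buf.getvalue())
```

A consultant who can produce this kind of export on demand cannot hold your data hostage; a consultant who cannot is a warning sign.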


About the author
Norman S. Wei is the founder and principal of Environmental Management and Training, LLC, a consulting and training firm based in Union, Washington. He offers regulatory seminars and consulting services throughout the country and can be reached by email or through his company website.

SAN FRANCISCO, Calif., 1 September 2004 — Locus Technologies, a leader in environmental remedial solutions, remedial automation, and environmental information technology, and its teaming partner, Tierra Technologies, today announced that the US Navy has accepted the Closed-Loop Bioreactor Technology under the Naval Facilities Engineering Service Center (NFESC) Environmental Broad Agency Announcement (BAA) program for use on perchlorate clean-ups.

The Closed-Loop Bioreactor Technology was previously accepted by the Navy NFESC program for use on petroleum sites. The Locus and Tierra team has gone through extensive bench testing and evaluation of field data to demonstrate the effectiveness of the Closed-Loop Bioreactor Technology for degradation of perchlorate source areas and soils.

The Closed-Loop Bioreactor Technology is unique in that it affords clients a rapid and effective remedial alternative to address perchlorate source and soil contamination without the risk of flushing perchlorate to groundwater. The Closed-Loop Bioreactor Technology is also highly proven in petroleum clean-up applications, with recovery of phase-separated product within 30 to 90 days and the attainment of aquifer quality standards typically within 9 to 12 months.

“We are pleased to expand our support to the Navy and other DOD entities with the availability of this innovative and rapid cleanup technology for perchlorate and petroleum problems,” said Mark Bittner, Director of Locus Technologies’ Sacramento and Arizona Regions.

Locus Technologies is involved with several high-profile perchlorate investigation and treatment projects and has established a reputation as an industry leader in emerging contaminants, such as perchlorate and NDMA. Locus often teams with quality firms, such as Tierra Technologies, to provide its clients with innovative solutions and an unbiased approach to technology selection.

SAN FRANCISCO, Calif., 10 May 2004  — Locus Technologies has been awarded a multi-phase project to support the Arizona Department of Environmental Quality (ADEQ) at the West Central Phoenix, North Canal State Superfund site. The project, which is valued at nearly $300,000, will allow Locus to provide consulting support to the ADEQ for both a Remedial Investigation/Feasibility Study (RI/FS) and Early Response Action (ERA) services.

Locus is pleased to build on its established and in-depth experience at Superfund sites in the west and its extensive resume with chlorinated hydrocarbon investigation and remediation programs.

“Locus staff in Phoenix have extensive prior experience with ADEQ WQARF projects. That experience, coupled with Locus’ outstanding reputation at EPA Region 9 sites, makes for a powerful combination in supporting the ADEQ with this valued public service project,” says Mark Bittner, Locus Regional Director and WCP Project Manager.

Neno Duplancic, President and CEO of Locus, adds “Locus is also providing ADEQ with database management services as an integral part of our RI/FS program. We are excited to provide ADEQ with hands-on exposure to our Environmental Information Management (EIM™) web-based data management software. EIM will allow ADEQ to easily track current and historic groundwater data trends, increasing department efficiency and saving costs.”

SAN FRANCISCO, Calif., 13 April 2004 — Locus Technologies (Locus), a leading provider of web-based environmental information management systems, today announced that the new release of its LocusFocus portal, scheduled for release in summer 2004, will help public companies comply with the corporate governance requirements of the Sarbanes-Oxley Act, which Congress passed into law in 2002 in response to a series of corporate financial crises. LocusFocus already addresses many aspects of managing a company’s environmental liability, such as analytical data management, auditing, and document management. Additional functionality will include Environmental, Health, and Safety (EH&S) reporting and financial management of environmental liability, which companies are developing in response to governance requirements. As part of balance sheet reporting, publicly traded companies must report their environmental liabilities and reserves.

The Sarbanes-Oxley Act mandates more rigorous corporate governance practices for all aspects of a company’s business, including recognition, measurement, display, and disclosure of environmental liabilities. Given the increased emphasis on corporate accountability and the penalties and personal liability to CEOs and CFOs for non-compliance, companies are evaluating ways to increase the accuracy of assessing and quantifying environmental liabilities. Locus’s updated portal is designed to meet this growing need and to provide a tool to help companies organize, manage, and document their environmental liabilities.

“In order to minimize the possibility of erroneous or misleading disclosure, companies increasingly rely on consultants to assist in evaluating internal controls and disclosure procedures, conduct due diligence, analyze and document environmental liabilities, and review existing environmental liability disclosures for compliance with applicable securities laws. Now the companies, their consultants, and legal staff have a tool to document and manage all aspects of environmental liability in a way that was not possible before. By keeping all information about contaminated sites in a single, centralized, secure, web-based system, companies can aggregate information in real time, check the cleanup status of every site, monitor financial performance of consultants and contractors, and most importantly have real time corporate environmental reserve and liability information at their fingertips”, said Neno Duplancic, President and CEO of Locus.

Companies that subscribe to LocusFocus will make better use of resources and find it easier to comply with US and international environmental requirements, while at the same time lowering their operating costs associated with environmental information management.

By providing a systematic structure for planning, internal auditing and reviewing environmental information, LocusFocus enables companies to meet and exceed environmental requirements as well as enhance their credibility with customers, stakeholders and the public.

SAN FRANCISCO, Calif., 5 February 2004 — Locus Technologies (Locus), a leader in environmental information management, announced today that it has expanded its award winning, web-based Environmental Information Management™ (EIM™) system to include the capability of exporting data in compliance with the US EPA Region V Geographic Information System and Field Environmental Decision Support (FIELDS) system.

The Region V FIELDS software forms the foundation for an EPA system that provides data analysis and interpretation for environmental decision-making. The results allow EPA project managers to evaluate the extent of contamination and hot spot sizes, estimate health risks, prioritize site goals, and weigh potential actions. Users include US EPA Regions, National Oceanic and Atmospheric Administration’s coastal restoration scientists, state and tribal agencies, as well as the private and academic community.

EIM™’s compatibility with Region V’s requirements will open a whole region to the EIM™ data management system. Now, companies and agencies with projects located in Illinois, Indiana, Michigan, Minnesota, Ohio, Wisconsin, and the 35 Tribal Nations in those areas can feel confident when selecting EIM™ to manage their environmental data.

“We are very pleased that EIM™ now provides export ability consistent with US EPA Region V requirements. By bringing EIM™ technology to its customers in the upper Midwest, Locus has provided the first web-based tool to upload and transmit vast amounts of sampling data to EPA Region V from a centralized web system. The EIM™ system links laboratories, clients, and their consultants to EPA Region V through a seamless web-based interface. By leveraging Web Services and XML technologies, Locus continues to provide its customers with a cost-competitive, centralized analytical information management system that is superior to any client-server system available in the marketplace today,” said Dr. Neno Duplancic, President and CEO of Locus Technologies.

“As our client base continues to grow throughout the nation, Locus is committed to meeting all federal and state electronic data deliverables for the environmental industry, including the XML-based, federal SEDD, once it has been approved,” added Dr. Duplancic.

SAN FRANCISCO, Calif., 23 July 2003 — Locus Technologies (Locus), a leader in environmental information management, today announced that it has expanded its award winning, web-based Environmental Information Management™ (EIM™) system to include the capability of importing and exporting data in compliance with New Jersey’s Department of Environmental Protection (NJDEP) Site Remediation Program (SRP).

As part of New Jersey’s participation in the National Environmental Performance Partnership System (NEPPS), the SRP is developing groundwater indicators to show progress in groundwater contamination cleanup. The SRP has issued regulations requiring analytical results of sampling data to be submitted electronically and in a GIS-compatible format. Using this information, the SRP plans to delineate changes in the areal extent of each groundwater plume to evaluate environmental progress in cleaning up contaminated sites.

Addressing the issue of voluminous hardcopy data submissions, the SRP requires that all sites presently being remediated within New Jersey submit their data in electronic format. Moving away from hardcopy data submission has the potential to accelerate the review and statistical manipulation of information, significantly enhancing NJDEP’s ability to serve the regulated community and protect the environment and the public. The agency is already collecting massive amounts of data; therefore, the need to process this information quickly and accurately is a growing concern. With Locus’s incorporation of the SRP standard, EIM™ users now have the tools they need to import and export NJDEP data formats.

“We are very pleased that EIM™ now provides interoperability with NJDEP requirements. By bringing EIM™ technology to its customers in New Jersey, Locus has provided the first web-based tool to upload and transmit vast amounts of sampling data to the state from a centralized web system. The EIM™ system links laboratories, clients, and their consultants to the state through a seamless web-based interface. By leveraging Web Services and XML technologies, Locus continues to provide its customers with a cost-competitive, centralized analytical information management system that is superior to any client-server system available in the marketplace today,” said Dr. Neno Duplancic, President and CEO of Locus Technologies.

“Only three months after announcing California’s Water Resources Control Board AB2886 reporting requirements compatibility, Locus has delivered another important state standard. Locus is committed to meeting all federal and state electronic data deliverables for the environmental industry, including the XML-based, federal SEDD, once it has been approved,” added Dr. Duplancic.