Technology and data may seem abstract, but when it comes to environmental pollution or sustainability, they can be a matter of life and death. Several popular films based on actual events have seared into the public consciousness the consequences of mishandling or ignoring environmental data. Dark Waters (2019), Erin Brockovich (2000), and A Civil Action (1998) each dramatize scenarios in which communities were poisoned while companies or authorities failed to address warning signs adequately. These stories, alongside real-world case records, provide a stark cultural context for discussing data stewardship. Likewise, the 2023 film Oppenheimer, although about the creation of the atomic bomb, serves as a poignant metaphor for scientific responsibility. Let’s briefly look at each and the common thread between them:
- Dark Waters – This film tells the story of attorney Rob Bilott’s battle against DuPont, which had been releasing a toxic chemical (PFOA, used in Teflon) into communities in West Virginia for decades. One of the most disturbing revelations in the real case was that DuPont and other manufacturers were aware of the health hazards of PFOA long before the public was, but chose to conceal that information. Internal studies linked the chemical to organ damage and cancer, and DuPont even found PFOA in local water supplies but did not disclose it. In essence, they had the data that could have alerted regulators and residents to the danger, yet suppressed it. The result was widespread contamination – virtually an entire region exposed unknowingly – and numerous people developing illnesses like cancer. Dark Waters demonstrates that data mismanagement was not merely a clerical issue, but a moral failing: had DuPont been transparent and proactive, much suffering might have been averted. The film, and the events it’s based on, underscore why environmental data must be handled with honesty and urgency. It is a cautionary tale: if companies don’t police themselves with the data they have, eventually someone else will, at great expense.
In February 2017, DuPont and Chemours agreed to pay $670.7 million to settle about 3,550 personal-injury claims tied to PFOA contamination near DuPont’s Washington Works plant (the case portrayed in Dark Waters). The earlier 2004 class-action settlement set up the independent C8 Science Panel and a medical-monitoring program; later, DuPont, Chemours, and Corteva separately created a $1.185 billion fund for U.S. public water systems. Those settlements now fund medical monitoring and water-system treatment aimed at preventing similar harm in the future.
- Erin Brockovich – Perhaps the most famous environmental whistleblower story, this film recounts the investigation of a cluster of illnesses in Hinkley, California, caused by hexavalent chromium (Chromium-6) from a Pacific Gas & Electric (PG&E) facility leaking into the town’s groundwater. In the 1960s, PG&E had dumped chromium-tainted wastewater into unlined ponds, and by the 1980s, residents were reporting cancers, chronic rashes, and other ailments. What Erin Brockovich (a legal clerk turned activist) discovered was that PG&E knew its operations had contaminated the water and had data confirming this, yet had not only failed to inform the community but had also actively misled them at times. The company had monitored its wells and run tests – i.e., it had collected the data – but didn’t act on it in a way that protected the public. The eventual lawsuit, settled in 1996 for $333 million (then a record amount), hinged on proving PG&E’s negligence and concealment of information. For our discussion, the lesson is clear: collecting data is meaningless if it’s not used to prevent harm. In Hinkley, the data sat in file cabinets while people fell ill.
Erin Brockovich’s work was essentially to drag that data into the light and force action. Today’s environmental managers watching that film would likely think, “If only there had been a requirement to report those water sample results in real time publicly, this might have been addressed sooner.” Modern environmental regulations do move in that direction. For instance, the EPA now requires community water systems to report water quality data to their consumers annually (the Consumer Confidence Report), partly to prevent another Hinkley scenario. The Brockovich case reinforces the importance of transparency and quick response, which robust data systems can facilitate.
- A Civil Action – This film (and the book it’s based on) tells the story of Woburn, Massachusetts, where, in the 1970s and 80s, industrial solvents (such as trichloroethylene) from local factories contaminated the town’s drinking water aquifer. The result was an outbreak of childhood leukemia and other health issues in the community. What’s instructive here is the role of environmental data, or the lack thereof, in the unfolding tragedy. Residents suspected something was wrong with the water (it smelled and tasted foul), but it took years for the contamination to be formally acknowledged. Once lawsuits began, data became key evidence: tests eventually showed high levels of toxins in the wells, and health statistics showed a leukemia cluster far above the normal rate.
Before trial, UniFirst settled separately for $1.05 million. At trial, the jury found W.R. Grace liable; Beatrice Foods was found not liable. W.R. Grace then paid the families about $8 million in September 1986. Years later, EPA finalized a $69.5 million Superfund cleanup settlement with five responsible parties (including W.R. Grace and Beatrice). Beatrice Foods was dismantled after a 1986 leveraged buyout and had ceased to exist as an independent U.S. company by 1990. W.R. Grace later went through a bankruptcy tied to asbestos liabilities and settled with EPA, then continued operating as a much smaller company.
The movie, “A Civil Action,” dramatizes the lengthy and complex lawsuit surrounding the contamination of drinking water wells in Woburn and the subsequent illnesses, including eight childhood leukemia deaths. Here, the data was eventually obtained and used in court, but by then, the damage had already been done. The case led to a large settlement and was one impetus for the creation of the EPA’s Superfund program (to tackle such sites more aggressively). The takeaway is that early data gathering and intervention could have saved lives. It’s a story that justifies why we have continuous monitoring at many sites today and why we strive to detect anomalies early. In Woburn’s time, if there had been better groundwater monitoring and required reporting, perhaps those wells would have been shut down before people fell ill. Modern systems, like Locus’s, can integrate health and environmental data to some extent; for instance, if abnormal illness reports occur near a facility, they could be cross-referenced with environmental release data. It’s a level of integration that is still evolving, but Woburn’s legacy is a driver for it.
Together, these films (and the actual events they depict) send a powerful message: environmental data carries a profound responsibility. It’s not just numbers on a spreadsheet; it represents the health of real people and ecosystems. Ignoring or hiding environmental data is portrayed as morally reprehensible in these stories, and rightly so. In contrast, sharing data and acting on it early is shown as heroic (the Erin Brockovich story ends with justice, but also the lingering reality that the community still suffers decades later, reminding us that even justice after the fact is too late for the victims).
Finally, let’s touch on Oppenheimer. This might seem unrelated (it’s about the development of nuclear weapons), but the thematic link is the ethical burden that scientists and engineers face. J. Robert Oppenheimer and his colleagues gathered data (physics experiments) and developed a technology that changed the world, and then had to reckon with the consequences of its use. Many of them argued for control and responsibility over nuclear technology after witnessing its destructive power. In environmental science, the parallel is that those of us who collect and analyze environmental data must also reckon with our responsibility. We might not be splitting the atom, but we are often the first to see signs of a threat (like a toxin in a water sample). Once we have that knowledge, the onus is on us to speak up and ensure it’s used to protect people. In one scene from Oppenheimer, the scientists realize the results of their work could potentially ignite the atmosphere – a terrifying data point to consider. In environmental work, while the stakes are usually less cosmic, the principle is the same: you might find that a factory’s waste is poisoning a river; such knowledge should ignite action.
In a way, Oppenheimer’s story has become a cautionary tale about the dual-use of scientific progress: it can save or destroy. Environmental data is similar: when used effectively (to inform cleanup, regulation, and safer practices), it can save communities; when suppressed or mishandled, it can lead to disaster. The Oppenheimer film’s cultural impact has revived discussion about scientists’ accountability, with viewers drawing connections to climate change and other fields where experts bear the burden of warning the world. Environmental professionals managing big data should feel that same weight of responsibility: we must ensure that data is accurate, timely, and delivered to the right people who can act on it. We must advocate for solutions when the data reveals danger, just as Oppenheimer eventually advocated for international control of nuclear weapons when he saw the peril they posed.
Conclusion: The Foresight of Visionaries and the Path Forward
Reflecting on Neno Duplan and Gregory Buckle’s predictions from over 30 years ago, one cannot help but be impressed by their foresight. They anticipated a future where environmental engineers would lean on artificial intelligence and automated systems to manage an explosion of data – a future that is very much our present. Their 1989 call for “expert systems… interpreting and processing large bodies of information” was a glimpse into what has become commonplace: from automated data validation to AI-driven analytics. The accuracy of their vision is evident each time an environmental manager uses a cloud dashboard to decide where to sample next, or when an algorithm in an environmental software flags an anomalous reading for investigation.
Yet, the journey from vision to reality was not automatic; it required pioneers like them (and companies like Locus) to execute on those ideas. The development of systems like ITEMS in the early 90s and the continuous innovation through the 2000s and 2010s laid the groundwork for the powerful environmental informatics tools we have today. In many respects, Locus’s platforms are the direct descendants of those early efforts, enriched by cloud computing, modern programming, AI, and vastly greater computing power. Locus anticipated the importance of cloud-based environmental data management and compliance long before it became industry-standard, demonstrating leadership in transforming the concept of centralized, accessible environmental data into a practical service. As a result, organizations now can handle their “hazardous data” with far more confidence and clarity – whether it’s a complex Superfund site, drinking water quality, or a corporate sustainability program tracking emissions.
The importance of responsible data stewardship cannot be overstated. We have seen what happens in its absence: poisons in drinking water, communities betrayed, costly legacies of pollution. On the other hand, when data is appropriately managed and transparently shared, it empowers all stakeholders – regulators can enforce more effective policies, companies can enhance their practices, communities can remain informed, and scientists can advance their understanding. The early visionaries understood that merely collecting data was not enough; one had to close the loop by interpreting and acting on it. Today’s best practices in environmental management finally embody that principle. For example, some companies now integrate sensor networks with automated response systems – if contamination is detected, not only is an alert sent out, but a pump-and-treat system may also automatically activate. This is effectively real-time remediation triggered by data, which is precisely the kind of outcome early AI proponents hoped for (faster, more effective cleanups).
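The sensor-to-response loop described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration – the names (`SensorReading`, `ACTION_LIMITS_UG_L`, the pump command strings) are invented for this sketch and are not part of any real Locus or regulatory API; real systems would add authentication, audit logging, and human review before actuating equipment.

```python
from dataclasses import dataclass

# Hypothetical action limits in µg/L, invented for illustration only.
ACTION_LIMITS_UG_L = {"TCE": 5.0, "PFOA": 0.004}

@dataclass
class SensorReading:
    well_id: str
    analyte: str
    value_ug_l: float

def evaluate(reading, alerts, pump_commands):
    """Compare a reading against its action limit; on exceedance,
    record an alert and a command to start pump-and-treat."""
    limit = ACTION_LIMITS_UG_L.get(reading.analyte)
    if limit is not None and reading.value_ug_l > limit:
        alerts.append(f"{reading.well_id}: {reading.analyte} "
                      f"{reading.value_ug_l} µg/L exceeds {limit} µg/L")
        pump_commands.append(("START_PUMP_AND_TREAT", reading.well_id))

alerts, commands = [], []
evaluate(SensorReading("MW-07", "TCE", 12.3), alerts, commands)  # exceeds limit
evaluate(SensorReading("MW-08", "TCE", 1.1), alerts, commands)   # within limit
```

The design point is that detection and response are one pipeline: the same record that raises the alert also carries enough context (well, analyte, value) to drive the remediation action.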
Looking ahead, with challenges such as climate change and PFAS, or “forever chemicals,” emerging, the volume and complexity of environmental data will continue to grow. We will likely need AI and automation even more deeply integrated – possibly AI that can predict environmental risks before they fully materialize. Locus is already exploring the use of its rich datasets to train predictive models, for instance, to anticipate where groundwater might become unsafe or to optimize remediation strategies. The company’s foresight in accumulating high-quality, geospatially tagged historical data sets enables it to contribute to AI models that benefit not just one client, but society at large (for example, predicting the spread of wildfire smoke or the impact of a chemical spill using patterns learned from past data). This is a frontier that Duplan and Buckle might find very familiar in spirit: using advanced technology to make sense of complexity and guide decisions in a timely way.
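The idea of anticipating risk from accumulated data can be illustrated with the simplest possible predictor: a linear trend fit over historical concentrations, projected forward to estimate when a well might cross a regulatory limit. Real predictive models would use far richer geospatial and machine-learning approaches; this hedged sketch, with invented numbers, only shows the principle.

```python
def estimate_exceedance_time(times, values, limit):
    """Least-squares linear fit; return the projected time at which the
    fitted trend reaches `limit`, or None if the trend is flat or declining."""
    n = len(times)
    mean_t = sum(times) / n
    mean_v = sum(values) / n
    cov = sum((t - mean_t) * (v - mean_v) for t, v in zip(times, values))
    var = sum((t - mean_t) ** 2 for t in times)
    slope = cov / var
    intercept = mean_v - slope * mean_t
    if slope <= 0:
        return None  # concentration not rising; no projected exceedance
    return (limit - intercept) / slope

# Hypothetical quarterly TCE readings (µg/L) rising toward a 5 µg/L limit.
quarters = [0, 1, 2, 3]
tce = [1.0, 2.0, 3.0, 4.0]
t_exceed = estimate_exceedance_time(quarters, tce, 5.0)  # projected quarter 4
```

A trend crossing its limit in the future is precisely the kind of signal that lets a site operator act before, rather than after, a well becomes unsafe.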
In conclusion, the predictions made in 1989 and 1992 about AI, automation, and big data in environmental compliance have been mainly validated by subsequent developments. More importantly, those predictions have helped drive a cultural shift in how we view environmental data, from an overwhelming by-product of regulatory compliance to a valuable asset for proactive environmental protection. The marriage of technology and environmental science envisioned by those authors has given environmental professionals “superpowers” of analysis and insight that the previous generation could only dream of. With those powers comes the duty to use them wisely. As we’ve learned through challenging experience (and as popular culture has echoed), data can either be the hero or the villain in environmental stories. It is up to us, engineers, scientists, software developers, company leaders, and regulators, to ensure it is the hero by committing to transparency, swift action on findings, and continuous improvement of our tools.
Locus and its peers exemplify how industry can take the lead in this endeavor by providing platforms that not only manage data but also actively help interpret, visualize, communicate, and report it. It is a fitting legacy for the likes of Duplan and Buckle that their early work continues to resonate. Their insight and Locus’s ongoing innovation together highlight a bright path forward: one where big data and AI are harnessed in the service of environmental health, where we anticipate problems before they escalate, and where “taming” environmental data means safer communities and a cleaner planet for generations to come.
Sources:
- Duplancic, Neno & Buckle, Gregory. “Hazardous Data Explosion.” Civil Engineering (ASCE), Dec. 1989, pp. 68-70.
- Duplancic, Neno & Buckle, Gregory. “Taming Environmental Data.” Civil Engineering (ASCE), Aug. 1992, pp. 56-58.
- Duplancic, Neno, McEvoy, Steve & Schweizer, John. “Automatic Savings.” Civil Engineering (ASCE), Aug. 1998.
- Union of Concerned Scientists. “DuPont, 3M Concealed Evidence of PFAS Risks.” Mar. 22, 2019.
- U.S. EPA. “‘A Civil Action’ – An Everyday Drama at the EPA.” News Release, Jan. 20, 1999.
- ABC News. “Erin Brockovich: the real story of the town decades later.” May 2021.
- Locus Technologies Press Release. “Locus Awarded Contract to Manage Los Alamos Lab’s Environmental Data,” Mar. 21, 2011.
- Locus Technologies Blog. “Locus’ Intellus Site Creates Big Data Transparency in the Cloud,” Oct. 2014.
- Locus Technologies Blog. “Artificial Intelligence & Blockchain Applied to Water & Energy,” originally in EBJ, Jun. 2020.
Neno Duplan
Founder & CEO
As Founder and CEO of Locus Technologies, Dr. Duplan has spent his career combining his understanding of environmental science with a vision of how to gather, aggregate, organize, and analyze environmental data to help organizations better manage and report their environmental and sustainability footprints. During the 1980s, while conducting research as a graduate student at Carnegie Mellon, Dr. Duplan developed the first prototype system for an environmental information management database. That work eventually led to the formation of Locus Technologies in 1997.
As technology has evolved and guidelines for environmental stewardship have expanded, so too has the vision Dr. Duplan holds for Locus. From the company’s deployment of the world’s first commercial Software-as-a-Service (SaaS) product for environmental information management in 1999 to the Locus Mobile solution in 2014, Dr. Duplan has continued to lead and challenge his team to be the leading provider of cloud-based EH&S and sustainability software.
Dr. Duplan holds a Ph.D. in Civil Engineering from the University of Zagreb, Croatia, an M.S. in Civil Engineering from Carnegie-Mellon, and a B.S. in Civil Engineering from the University of Split, Croatia. He also attended advanced Management Training at Stanford University.
Locus is the only self-funded water, air, soil, biological, energy, and waste EHS software company that is still owned and managed by its founder. The brightest minds in environmental science, embodied carbon, CO2 emissions, refrigerants, and PFAS hang their hats at Locus, and they’ve helped us to become a market leader in EHS software. Every client-facing employee at Locus has an advanced degree in science or professional EHS experience, and they incubate new ideas every day – such as how machine learning, AI, blockchain, and the Internet of Things will up the ante for EHS software, ESG, and sustainability.