BAD EPA DATA EQUALS BAD POLICY
January 12, 2006
Since the 1970s, the Environmental Protection Agency (EPA) has forced the U.S. business community to spend tens of billions of dollars unnecessarily to address what might be phantom risks generated by faulty EPA databases, says William Kovacs, vice president of the U.S. Chamber of Commerce's Environment, Technology and Regulatory Affairs Division.
The root cause of the problem is regulatory decision-making driven by risk assessments based on faulty physical chemical property data disseminated by the EPA, says Kovacs. Yet the EPA has failed to address the matter even though it has been brought directly to the agency's attention.
According to Kovacs:
- The physical chemical property data at issue reside in databases, or are generated by models, disseminated by the EPA; much of the data are faulty, in some instances egregiously so.
- The problem was first reported by scientists at the U.S. Geological Survey (USGS); their analysis of the pesticide DDT and its metabolite DDE revealed the data are so bad they cannot be used to assess how these chemicals disperse in the environment.
- Subsequently, a scientist at Eastman Kodak evaluated thousands of physical chemical property data entries for many chemicals identified in databases and models disseminated by the EPA and revealed that much of the data are unreliable.
Having confirmed the reality of the problem, the U.S. Chamber of Commerce issued a Data Quality Act (DQA) Request for Correction of the faulty data, but the EPA has dismissed the request, and whether it will ever address the issues remains to be seen, says Kovacs.
Yet improving the data could save business and industry hundreds of millions, possibly billions, of dollars in compliance costs. Moreover, using good quality data would enhance the EPA's reputation, says Kovacs.
Source: William L. Kovacs, "Bad Data in EPA Databases Result in Bad Policy," Environment and Climate: Heartland Institute, November 1, 2005.