L. W. McMahon, J. L. Miranda, and L. D. Welch
The overall goal of a well-designed and well-implemented sampling and analysis program is to measure accurately what is really there. Environmental decisions are made on the assumption that analytical results are, within known limits of accuracy and precision, representative of site conditions. Many sources of error exist that could affect the analytical results. Factors to consider as sources of error include improper sample collection, handling, preservation, and transport; inadequate personnel training; and poor analytical methods, data reporting, and record keeping. A quality assurance (QA) program is designed to minimize these sources of error and to control all phases of the monitoring process.
The application of a quality assurance/quality control (QA/QC) program for environmental monitoring activities at the ORR is essential to generating data of known and defensible quality. Each aspect of the environmental monitoring program, from sample collection to data management, must address and meet applicable quality standards.
The 1995 QA/QC results for the three sites have been compiled into a summary that represents the performance of the reservation as a whole. In past years, the results were reported separately for each of the three site analytical laboratories. The three laboratories were recently combined into a single entity, the Analytical Services Organization. The 1995 results are based on data from the Analytical Services Organization, the ORNL Environmental Sciences Division, the ORNL Industrial Hygiene Department, and the K-25 Site Technical Division.
8.2 FIELD SAMPLING QUALITY ASSURANCE
Field sampling QA encompasses many practices that minimize error and evaluate sampling performance; key among them are standard operating procedures (SOPs), continuing procedure review, and the training of field personnel.
Preparation of SOPs is a continually evolving process. In 1988, the Environmental Surveillance Procedures QC Program was issued for use by Energy Systems, with oversight by DOE-ORO and the EPA.
A process is in place for continuous improvement in the field sampling QA program and for incorporation of new procedures to reflect changing technologies and regulatory protocols. An Environmental Surveillance Procedures QC Committee is tasked with updating the field sampling and QC procedures. The committee includes representatives from each of the five Energy Systems facilities, DOE, ER, Central Waste Management, and the Analytical Services Organization. The committee ensures that requirements from relevant federal and state regulations are incorporated into the procedures and that new procedures are adopted only after appropriate review and approval. In addition, site-specific procedures are reviewed internally.
Because of changing technologies and regulatory protocols, training of field personnel is a continuing process. To ensure that qualified personnel are available for the array of sampling tasks within Energy Systems, training programs offered by EPA and by private contractors have been used to supplement internal training.
8.3 ANALYTICAL QUALITY ASSURANCE
The Energy Systems analytical laboratories have well-established QA/QC programs, well-trained and highly qualified staff, and excellent equipment and facilities. Current, approved analytical methodologies employing good laboratory and measurement control practices are used routinely to ensure analytical reliability. The analytical laboratories conduct extensive internal QC programs that demonstrate a high degree of accuracy, participate in several external QC programs, and use statistics to evaluate and continuously improve performance. Thus, QA and QC are daily responsibilities of all employees.
8.3.1 Internal Quality Control
Analytical activities are supported by the use of standard materials or reference materials (e.g., materials of known composition that are used in the calibration of instruments, the standardization of methods, spike additions for recovery tests, and other practices). Certified standards from the National Institute of Standards and Technology (NIST), EPA, or other DOE laboratories are used for such work. The laboratories operate under specific QA/QC criteria at each installation. Additionally, separate QA/QC documents relating to the analysis of environmental samples associated with regulatory requirements are developed.
QA/QC measurement control programs external to the sample analysis groups submit single-blind control samples to the analytical laboratories to monitor performance. The results of these periodic measurement programs are statistically evaluated and reported to the laboratories and their customers. Most reports are issued quarterly, and some laboratories compile annual summary reports. These reports assist in evaluating the adequacy of analytical support programs and procedures. If serious deviations are noted by the QC groups, the operating laboratories are promptly notified so that corrective actions can be initiated and problems can be resolved. QC data are stored in an easily retrievable manner so that they can be related to the analytical results they support.
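The statistical evaluation of blind control-sample results can be sketched as a percent-recovery calculation against the known composition, together with a deviation check of the kind some intercomparison programs apply. This is a hedged illustration, not the laboratories' actual algorithm; the function names and control limits are hypothetical.

```python
# Hypothetical sketch of evaluating single-blind QC sample results:
# percent recovery against the known (true) value, plus flagging of
# results far from the group mean.  Limits here are illustrative only.
from statistics import mean, stdev

def percent_recovery(measured: float, true_value: float) -> float:
    """Recovery of the known blind-sample value, as a percentage."""
    return 100.0 * measured / true_value

def flag_outliers(results: list, n_sigma: float = 3.0) -> list:
    """Flag results more than n_sigma standard deviations from the
    mean of all reported results (a common intercomparison criterion)."""
    m, s = mean(results), stdev(results)
    return [abs(r - m) > n_sigma * s for r in results]

# Example: three labs report a blind sample whose true value is 50.0 ug/L
recoveries = [percent_recovery(x, 50.0) for x in (48.9, 51.2, 49.5)]
```

A QC group would typically tabulate such recoveries over time and report them quarterly, as described above.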
The review process comprises verification and validation. Verification and validation of environmental data are performed as components of the data collection process, which includes planning, sampling, analysis, and data review. Verification and validation of field and analytical data collected for environmental monitoring and restoration programs are necessary to ensure that data conform with applicable regulatory and contractual requirements. Validation of field and analytical data is a technical review performed to compare data with established quality criteria to ensure that data are adequate for their intended use. The extent of project data verification and validation activities is based upon project-specific requirements.
Over the years, the environmental data verification and data validation processes used by ORR environmental programs have evolved to meet continuing regulatory changes and monitoring objectives. Procedures have been written to document the processes. For routine environmental effluent monitoring and surveillance monitoring, data verification activities may include checking whether (1) data have been accurately transcribed and recorded, (2) appropriate procedures have been followed, (3) electronic and hard-copy data show one-to-one correspondence, and (4) data are consistent with expected trends.
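The verification checks above can be sketched as simple automated comparisons. The record structures and function names below are hypothetical, chosen only to illustrate the one-to-one correspondence and trend-consistency checks.

```python
# Illustrative verification checks: hard-copy vs. electronic
# correspondence, and a crude consistency-with-trend test.
def verify_correspondence(electronic: dict, hard_copy: dict) -> list:
    """Check one-to-one correspondence between electronic and
    hard-copy records; return a list of discrepancies found."""
    issues = []
    for sample_id, value in hard_copy.items():
        if sample_id not in electronic:
            issues.append(f"{sample_id}: missing from electronic record")
        elif electronic[sample_id] != value:
            issues.append(f"{sample_id}: transcription mismatch")
    return issues

def consistent_with_trend(value: float, history: list, factor: float = 2.0) -> bool:
    """Crude trend check: is the new value within a factor of the
    historical mean?  (The factor of 2 is an arbitrary illustration.)"""
    m = sum(history) / len(history)
    return m / factor <= value <= m * factor
```

In practice such checks would run against the environmental data management systems described later in this section.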
For example, an NPDES permit requires the permittee to conduct self-monitoring analyses of surface-water and wastewater effluents as defined in 40 CFR 136 and to certify that the data reported in the monthly discharge monitoring report are true and accurate.
Typically, routine data verification actions alone are sufficient to document the truthfulness and accuracy of the discharge monitoring report. For ER projects, routine verification activities are more contractually oriented and include checks for data completeness, consistency, and compliance against a predetermined standard or contract.
Certain projects may perform a more thorough technical validation of the data as mandated by the project's data quality objectives. For example, sampling and analyses conducted as part of a remedial investigation to support the CERCLA process may generate data that are needed to evaluate risk to human health and the environment, to document that no further remediation is necessary, or to support a multimillion-dollar construction activity and treatment alternative. In that case, the data quality objectives of the project may mandate a more thorough technical evaluation of the data against predetermined criteria. For example, EPA has established functional guidelines for validation of organic and inorganic data collected under the protocol of the EPA's CLP. These guidelines assist the data user in evaluating and interpreting data generated from monitoring activities that require CLP performance.
The validation process may result in identifying data that do not meet predetermined QC criteria (e.g., by flagging quantitative data that must be considered qualitative only) or in the ultimate rejection of data from their intended use. Typical criteria evaluated in the validation of CLP data include surrogate recoveries, spike recoveries, method blanks, instrument tuning, instrument calibration, continuing calibration verifications, internal standard response, comparison of duplicate samples, and sample holding times.
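The flagging logic described above can be sketched as a qualifier-assignment function. The control limits (75-125% spike recovery) and the "J" (estimated) and "R" (rejected) qualifier convention are illustrative defaults commonly used in data validation, not the functional guidelines themselves; the function and its thresholds are hypothetical.

```python
# A hedged sketch of assigning data-quality qualifiers from
# validation criteria like those listed above.  Limits are
# illustrative, not the EPA functional-guideline values.
def qualify_result(spike_recovery_pct: float,
                   holding_time_days: float,
                   max_holding_days: float = 28.0) -> str:
    """Return "" (usable as reported), "J" (estimated; quantitative
    data considered qualitative only), or "R" (rejected)."""
    if holding_time_days > 2 * max_holding_days:
        return "R"      # holding time grossly exceeded: reject
    if spike_recovery_pct < 75.0 or spike_recovery_pct > 125.0:
        return "J"      # recovery outside control limits: estimate
    if holding_time_days > max_holding_days:
        return "J"      # holding time exceeded: estimate
    return ""
```

A validator would apply such rules criterion by criterion and retain the most severe qualifier for each result.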
Electronic data transfers from portable computers in the field and from laboratory information management systems used by on-site and commercial analytical laboratories to environmental data management systems have greatly enhanced the efficiency of the review process. In addition, the ongoing development of data-review software applications continues to provide necessary tools for data review. For example, as groundwater monitoring data are compiled, automated computer checks screen the laboratory results.
Irregularities in the laboratory results that are discovered through this program are flagged and reviewed with the laboratory. If corrections need to be made, the laboratory provides a revised laboratory report. If a data point is found to be an outlier, it remains flagged in the data base as information for the data user.
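The retain-and-flag policy described above can be sketched as follows: out-of-range values are marked but never deleted, so the data user sees both the result and its flag. The record fields and range check are hypothetical illustrations.

```python
# Sketch of flagging (not removing) outliers in a results data base.
from dataclasses import dataclass, field

@dataclass
class Result:
    sample_id: str
    value: float
    flags: list = field(default_factory=list)

def review(results, lower, upper):
    """Flag values outside an expected range; every record is kept,
    so flagged outliers remain visible to the data user."""
    for r in results:
        if not (lower <= r.value <= upper):
            r.flags.append("outlier")   # flagged, never deleted
    return results
```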
Continuing improvements are being made to the computerized environmental data management systems maintained by the Y-12 Plant, ORNL, and the K-25 Site to improve the functionality of the systems, to allow access by a wide range of data users, and to integrate the mapping capabilities of a geographic information system (GIS) with the data bases containing results of environmental monitoring activities.
Integration of compliance-monitoring data for the ORR with sampling and analysis results from remedial investigations by the ER Division is a function of the Oak Ridge Environmental Information System (OREIS). OREIS is necessary to fulfill requirements prescribed in both the FFA and TOA and to support data management activities for all five facilities managed by Energy Systems.
The FFA, a tripartite agreement between DOE, EPA Region IV, and the state of Tennessee, requires DOE to maintain one consolidated data base for environmental data generated at DOE facilities on the ORR. According to the FFA, the consolidated data base is to include data generated pursuant to the FFA as well as data generated under federal and state environmental permits. The TOA further defines DOE staff obligations to develop a quality-assured, consolidated data base of monitoring information that will be shared electronically on a near-real-time basis with state staff.
OREIS is the primary component of the data management program for the ER Program, providing consolidated, consistent, and well-documented environmental data and data products to support planning, decision making, and reporting activities. OREIS provides a direct electronic link of ORR monitoring and remedial investigation results to EPA Region IV and the state of Tennessee DOE Oversight Division.

8.3.2 External Quality Control
In addition to the internal programs, all Energy Systems analytical laboratories are directed by DOE, and are expected by EPA, to participate in external QA programs. These programs generate results that can be objectively and independently evaluated. The external QA programs typically consist of the Energy Systems laboratories analyzing a sample of unknown composition provided by various QA organizations. The organizations know the true composition of the sample and provide the Energy Systems laboratories with a data report on their analytical performance. The sources of these programs are laboratories in EPA, DOE, and the commercial sector. Energy Systems participates in ten such programs (Table 8.1).
The following sections describe the external QA programs in which Energy Systems participates.

8.3.2.1 Environmental Protection Agency Contract Laboratory Program (CLP)
The Contract Laboratory Program (CLP) is an EPA-administered QA element used to evaluate laboratory analytical proficiency against the analytes and requirements of the current statement of work. The program operates from the Contract Laboratory Analytical Services Support office in Alexandria, Virginia, in cooperation with the EPA regional offices.
This program evaluates laboratories for the determination of organic and inorganic contaminants in aqueous and solid hazardous waste materials and enforces stringent QA/QC requirements to ensure comparable data.
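The program's threshold-based scoring, described in the next paragraph, can be sketched as a percent-acceptable calculation against a fixed cutoff. This is a simplified illustration; the actual CLP algorithm weighs many additional criteria, and the function names are hypothetical.

```python
# Simplified sketch of pass/fail scoring against the CLP's
# 75% acceptance threshold (actual scoring weighs more criteria).
def clp_score(acceptable: int, total: int) -> float:
    """Percentage of measurements scored acceptable."""
    return 100.0 * acceptable / total

def passes(score: float, threshold: float = 75.0) -> bool:
    """A score of 75% or better indicates acceptable performance."""
    return score >= threshold
```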
This program scores on additional criteria beyond an "acceptable/unacceptable" evaluation of the measurement result. Under the CLP scoring algorithm, performance of 75% or better indicates acceptable performance. Values below this score indicate that deficiencies exist and that the participant has failed to demonstrate the capability to meet the contract requirements.

8.3.2.2 Water Supply Laboratory Performance Quality Control Program
This program is administered by EPA and is used by the state of Tennessee to certify laboratories for drinking water analysis. To maintain a certification, a laboratory must meet a specified set of criteria relating to technical personnel, equipment, work areas, QA/QC operating procedures, and successful analysis of QA samples. In addition, inclusion on the state of Tennessee's UST-approved listing may be granted as a result of successful participation in this program.

8.3.2.3 Water Pollution Performance Evaluation Quality Control Program
This program is used by DOE to evaluate laboratories engaged in analysis of polluted water samples at existing and former DOE sites. It is administered by EPA in Cincinnati, Ohio (Region V). It is also used by some states as part of their laboratory certification process.

8.3.2.4 American Industrial Hygiene Association Proficiency Analytical Testing Program
The American Industrial Hygiene Association (AIHA) administers the Proficiency Analytical Testing (PAT) Program as part of its accreditation process for laboratories performing analyses of industrial hygiene air samples.

8.3.2.5 EPA Discharge Monitoring Report Quality Assurance Study
EPA conducts a national QA program in support of NPDES permits; participation is mandatory for major permit holders. EPA supplies the QA samples and furnishes the evaluated results to the permittee, who is required to report the results and any necessary corrective actions to the state or regional coordinator.

8.3.2.6 EPA Intercomparison Radionuclide Control Program
The EPA Intercomparison Radionuclide Control Program is administered by NERL-LV. Samples include water, air filters, and milk. The state of Tennessee requires participation for drinking-water certification of radionuclide analysis, and all sites are involved. The NERL-LV program calculates a normalized standard deviation for each laboratory based on all reported results; under the program's criteria, any reported value more than three standard deviations from the mean is considered unacceptable.

8.3.2.7 Environmental Lead Proficiency Analytical Testing (ELPAT) Program
The Environmental Lead Proficiency Analytical Testing (ELPAT) Program, administered by AIHA, was established in 1992 to evaluate analysis of environmental lead samples in different matrices: paint, soil, and dust wipes. The participating laboratory analyzes each matrix at four levels. In addition, a laboratory may request accreditation for lead analysis through this program.

8.3.2.8 Mixed Analyte Performance Evaluation Program (MAPEP)
The Mixed Analyte Performance Evaluation Program (MAPEP) is a program set up by the DOE Radiological and Environmental Sciences Laboratory in conjunction with the Laboratory Management Division of the Office of Technology Development to evaluate analysis of mixed-waste samples. The program is evaluated by Argonne National Laboratory.

8.3.2.9 DOE Environmental Measurements Laboratory (EML) Quality Assessment Program
Participation in the radionuclide Quality Assessment Program, administered by the DOE Environmental Measurements Laboratory (EML) in New York, is required by DOE Order 5400.1. Various matrices, such as soil, water, air filters, and vegetation, are submitted semiannually for analysis of a variety of radioactive isotopes. All matrices, except air filters, are actual materials obtained from the environment at a DOE facility. A statistical report is submitted to the sites by EML for each period.

8.3.2.10 Proficiency Environmental Testing (PET) Program
The Proficiency Environmental Testing (PET) program is a service purchased from an outside vendor and is used by all five Energy Systems analytical laboratories and the DOE laboratory at the Fernald, Ohio, facility to meet the need for a QA program covering all environmental analyses. The samples are supplied by the commercial company at two concentration levels (high and low). All data from each of the six laboratories are reported to the supplier, which provides a report on the evaluated data to the site QA/QC managers. The report includes the percent recovery of the reference value, the deviation from the mean of all reported data, specific problems in a site laboratory, and other statistical information. A corporate report is also provided that compares the data from the Energy Systems laboratories with those of other corporate laboratories.

8.3.2.11 Quality Assessment Program for Subcontracted Laboratories
Requirements for QC of Analytical Data for the ER Program (Energy Systems 1992b) defines the basic requirements that laboratories must satisfy in providing support to ORR ER projects. Oversight of subcontracted commercial laboratories is performed by Technical Subcontracting Office (TSO) personnel, who conduct on-site laboratory reviews and monitor the performance of all subcontracted laboratories.

8.4 DATA MANAGEMENT, VERIFICATION, AND VALIDATION