CHAPTER ONE

TABLE OF CONTENTS

Section

1.0 INTRODUCTION

1.1 PLANNING

1.2 IMPLEMENTATION

1.3 ASSESSMENT

1.4 REFERENCES AND SOURCES FOR ADDITIONAL INFORMATION ON PROJECT QUALITY ASSURANCE AND CONTROL

1.5 GLOSSARY

Figure 1. The project life cycle under EPA's quality system
For a summary of changes in this version from the previously published Chapter One, please
see Appendix A at the end of this document.
1.0 INTRODUCTION
The goal of this chapter is to provide an understanding of environmental data and the need for data quality. EPA has developed numerous guidance documents on quality assurance (QA). This chapter is not intended to summarize that previously developed EPA guidance. Instead, it provides familiarity with the regulations and guidance relating to QA and where to find them.
Regulations promulgated under the Resource Conservation and Recovery Act (RCRA) of 1976, as amended, require the collection and use of environmental data by regulated entities. In addition, organizations often collect and use environmental data for decision making. Given the significant decisions to be made based on environmental data, it is critical that the data are of sufficient quantity and quality for their intended use and can support decision making based on sound science.
In response to the need for quality data, it is recommended that all parties follow a structured
system of data quality assurance and quality control (QA/QC). In addition, some of the RCRA
regulations include specific requirements for ensuring data quality.
This chapter provides general guidance intended to ensure data are of sufficient quality for
their intended use. Its intended audience is any entity, government or private, that may be collecting environmental data. It is designed to support the efforts of those responsible for preparing and reviewing project planning documents such as Quality Assurance Project Plans (QAPPs), those involved in implementing and assessing data collection and generation activities in the field or laboratory, and those who use the data for decision making.
Due to the diversity of data collection efforts, it is not possible to provide all details necessary
to meet the needs of all members of the intended audience. However, EPA has developed a
variety of detailed QA guidance documents that are incorporated into this Chapter by reference.
This series of quality systems documents can be accessed on EPA's Quality web site at:
http://www.epa.gov/quality. These documents describe in detail EPA policies and procedures for
planning, implementing and assessing the effectiveness of quality systems.
EPA's quality system comprises three structural levels: policy, organization/program, and
project. This document addresses quality at the project level of the system, including technical
aspects of analytical method quality assurance (QA) and quality control (QC). Entities which
desire guidance on the other two structural levels (policy and organization/program levels) can
access such guidance at the aforementioned EPA quality web site.
A project's life cycle under EPA's quality system has three phases: planning, implementation,
and assessment. This chapter is organized into these three phases; Figure 1 illustrates the process.
Additionally, this chapter contains general project QC guidance to be used with the
subsequent chapters and methods in this manual. It should be noted that several methods (e.g.,
Method 8000) also contain general QC criteria and guidance that pertain to the individual methods
referenced therein (e.g., Methods 8081, 8082, 8260 and 8270). Individual methods may also
contain QC criteria specific only to that method. The QC criteria in the general methods take
precedence over chapter QC criteria. Method-specific QC criteria take precedence over general
method QC criteria.
1.1 PLANNING
Planning, the first phase of a project’s life cycle, involves the development of project
objectives and acceptance or performance criteria using a systematic process. Data quality
objectives (DQOs) and a sampling and analysis design are established to generate data of an
appropriate type, quality and quantity to meet project objectives. The final output of this phase is
a planning document, such as a QAPP, and/or a sampling and analysis plan (SAP) or a waste
analysis plan (WAP).
This section provides guidance on activities and concepts that EPA recommends to be used
or considered during the planning phase, when appropriate to a specific project.
1.1.1 Systematic Planning

Systematic planning is a process designed to ensure that the level of detail in planning is
commensurate with the importance and intended use of the work and the availability of resources
to accomplish it. The ultimate goal of systematic planning is to ensure collection of the
appropriate type, quantity, and quality of data to support decisions with acceptable confidence.
Following is a summary of EPA's Agency-wide guidance on systematic planning. More detail can be found in the EPA Quality Manual for Environmental Programs (CIO 2105-P-01-0). Elements of systematic planning include the following:
• Description of the project goals and objectives (i.e., what the project is intended to accomplish)
• Identification of the type (e.g., individual data points to be used to estimate risk at a site, multi-point composites to be used to evaluate the average concentration in a decision unit), quantity and quality (e.g., screening for the presence/absence of an analyte, definitive data supported by all method-specific QC results) of data needed. Be specific about the kind of analytical result needed to make a decision: whether the collected results must be comprehensive and meet well-defined DQOs, or are merely for screening purposes to support a presence/absence decision.
• Description of how, when, and where the data will be obtained, and identification of any
constraints on data collection.
• Description of how acquired data will be analyzed (i.e., field and/or laboratory),
evaluated, and assessed against performance criteria. If statistical assumptions are
made as part of the planning process, the assessment must discuss how the
assumptions will be verified as accurate, and what actions will be taken if the statistical
assumptions are not supported by the data.
Planners should also recognize that existing data (i.e., secondary data) can be useful in supporting
decision making. Secondary data can provide valuable information to help design a plan for
collecting new data, while also lowering the cost of future data collection efforts. However, the
limitations on using any secondary data must be clearly understood and documented. For
example, secondary data must be examined to ensure that their quality is acceptable for a given
application. Combining secondary data with current data can be a complex operation and should
be undertaken with care. Sometimes, statistical expertise is necessary to evaluate both data sets
before they can be combined. If combining data sets, make sure the historical data are appropriate in type and quality for the current project.
1.1.2 DQOs
The DQO process can be applied to any study, regardless of its size. While there is no
regulatory obligation to use the DQO process, it is the recommended planning approach for most
EPA data collection activities. The depth and detail of DQO development will depend on the study
objectives. The DQO process is particularly applicable to a study in which multiple decisions must
be reached. By using the DQO process, the planning team can clearly separate and delineate
data requirements for each decision to be made or question to be answered. It consists of seven
planning steps that are summarized below.
The purpose of Step 1 is to clearly define the problem that has created the need for
the study. In describing the problem, especially for more complex sites, it is often useful to
include a conceptual site model (CSM). The CSM is a three-dimensional "picture" of site
conditions at a discrete point in time that conveys what is known or suspected about the
facility, including releases, release mechanisms, contaminant fate and transport, exposure
pathways, potential receptors, and risks.
The purpose of Step 2 of the DQO process is to identify the key questions that need
to be answered in order to resolve the problem(s) identified in Step 1. Step 2 should also
identify any actions that may be taken based on study results. The goals of the study and
the alternative actions are then combined to form decision statement(s) that will resolve the
problem. A decision statement defines which of the identified alternative actions will be
pursued depending on the outcomes of the study.
The purpose of Step 3 of the DQO process is to identify the information needed to
resolve the decision statement. This may include, but is not limited to:
The purpose of Step 4 of the DQO process is to define the spatial and temporal
boundaries for the data collection design, including where samples will be collected. Spatial
boundaries describe the physical area (i.e., horizontal and vertical boundaries) of the study.
They can include geographic area or volume of material. Temporal boundaries include both
the period of time the data collection effort will represent and the timeframe to which the
decision will apply.
The purpose of Step 5 of the DQO process is to consider the outputs from Steps 1 through 4 and develop "If..., then..., else" decision rules that unambiguously state which of the
alternative actions identified in Step 2 will be pursued. These if/then/else decisions should
be formulated to be dependent on how the results of the study compare to an established
action level.
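
For illustration only (this sketch is not part of EPA guidance, and the analyte, units, and action level are invented examples), such a Step 5 decision rule could be expressed in Python as follows:

    # Hypothetical sketch of an "If..., then..., else" decision rule (Step 5).
    # The analyte, units, and the 400 mg/kg action level are invented examples.
    def decision_rule(result_mg_per_kg: float, action_level_mg_per_kg: float) -> str:
        """Return the alternative action implied by the decision rule."""
        if result_mg_per_kg > action_level_mg_per_kg:
            return "Result exceeds the action level: pursue the remediation alternative."
        else:
            return "Result is at or below the action level: no further action."

    print(decision_rule(450.0, 400.0))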
The purpose of Step 6 of the DQO process is to set limits on decision errors, and to
document those limits. For judgmental and random samples, Step 6 should examine
consequences of making incorrect decisions, and place acceptable limits on the likelihood of
making decision errors. For random samples, Step 6 should specify any statistical
hypothesis to be considered and all applicable statistical tests that will be used to assess the
data.
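
As a purely illustrative example of a Step 6 output for a random data set (the data, action level, and alpha below are invented, and a one-sample t-test is only one of many tests a project might specify), the documented error limits might be exercised as follows:

    # Hypothetical Step 6 sketch: test whether the site mean exceeds an action
    # level, with the tolerable Type I (false rejection) error rate alpha
    # documented as a planning output. All values are invented.
    from scipy import stats

    measurements = [3.1, 4.7, 2.9, 5.2, 3.8, 4.1]  # sample results, mg/L
    action_level = 4.0                              # action level, mg/L
    alpha = 0.05                                    # tolerable decision-error rate

    t_stat, p_value = stats.ttest_1samp(measurements, action_level,
                                        alternative="greater")
    if p_value < alpha:
        print("Reject the null hypothesis: the mean likely exceeds the action level.")
    else:
        print("Insufficient evidence that the mean exceeds the action level.")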
The purpose of Step 7 of the DQO process is to develop the data collection plan that
will satisfy the objectives presented in Steps 1 through 6. RCRA Waste Sampling Draft
Technical Guidance, dated August 2002, provides guidance that may be used during
sampling design development. EPA also developed a guidance document called Guidance on Choosing a Sampling Design for Environmental Data Collection (QA/G-5S), to specifically provide the information needed to carry out Step 7 and develop a sampling design.
The proposed plan should be the most resource-effective data collection design that meets the previously identified performance or acceptance criteria. The plan for obtaining data is documented in detail by developing a project QAPP as per EPA Requirements for Quality Assurance Project Plans (EPA QA/R-5).
1.1.3 Development of QAPPs, Waste Analysis Plans (WAPs) and Sampling and Analysis
Plans (SAPs)
Documentation of planning processes and outcomes is critical to the:
Two types of planning documents are discussed in this section. Section 1.1.3.1 discusses
QAPPs, which are a key output of the systematic planning process. Section 1.1.3.2 discusses
WAPs and SAPs.
1.1.3.1 QAPPs
The primary purpose of the QAPP is to present the data collection activities to be
implemented, including all necessary QA and QC, to ensure that all data produced are of
known and documented quality, and that the data will satisfy the stated performance criteria.
When preparing a QAPP, a graded approach should be used to determine the level of
detail needed. This will ensure that the level of information presented is consistent with the
intended use of the results and the degree of confidence needed in the quality of the results.
The QAPP should be detailed enough to provide a clear description of every aspect of the
project, from site history through assessment of the planned data collection. At a minimum,
the QAPP should provide sufficient detail to demonstrate that:
• the project technical and quality objectives are identified and agreed upon;
• assessment procedures are sufficient for confirming that data of the type and
quality needed and expected are obtained; and
• any limitations on the use of the data can be identified and documented.
The elements of a QAPP are typically organized into related groups, such as the following:

• Data Generation and Acquisition: The elements in this group address all
aspects of project design and implementation, including the numbers, types,
and locations of all samples to be collected; the rationale for why the proposed
data collection effort will be sufficient to address the study objectives; all
sampling, subsampling and analytical procedures to be followed (i.e., both
sample preparation as well as determinative procedures); QC requirements for
all applicable field and laboratory procedures including the data quality
indicators (DQIs) discussed below in Section 1.1.4; instrument calibration and
maintenance for both field and laboratory equipment; use of secondary data;
and data management. Implementation of these elements ensures that
appropriate methods for sampling, analysis, data handling, and QC activities
are employed and properly documented.
• Assessment and Oversight: The elements in this group address the activities
for assessing the effectiveness of project implementation and associated QA
and QC activities. The purpose of assessment is to ensure that the QA Project
Plan is properly implemented as prescribed.
• Data Validation and Usability: The elements in this group address the QA
activities that occur after data collection or generation is completed. These
elements address how data will be reviewed, verified and validated as well as
how data will be assessed and reconciled with the project objectives.
While most QAPPs will describe project- or task-specific activities, there may be occasions
when a generic QAPP may be more appropriate. A generic QAPP addresses the general,
common activities of a program that are to be conducted at multiple locations or over a long
period of time. For example, a generic QAPP may be useful for a large monitoring program
that uses the same methodology at different locations. A generic QAPP describes, in a
single document, the information that is not site- or time-specific but applies throughout the program. Application-specific information is then added to the approved QAPP, either in the form of a site-specific QAPP, QAPP Addendum, or SAP.
1.1.3.2 WAPs and SAPs

In certain cases, WAPs or SAPs are required by a RCRA regulation. For example,
WAPs are required as part of a permit application. Where WAPs or SAPs are required by
regulation, the applicable regulations should be reviewed to ensure that the content and
format requirements for the WAP or SAP are understood. Additionally, it should be noted
that EPA has prepared various guidance documents to assist in preparing WAPs and SAPs
that meet various regulatory requirements. Examples of these guidance documents include
the following:

• Waste Analysis at Facilities That Generate, Treat, Store and Dispose of Hazardous Wastes (April 1994)

• RCRA Delisting Program Guidance Manual for the Petitioner (March 2000)
A QAPP may also be prepared, and required, as a supplement to any WAP or SAP. It
should also be noted that if a WAP or SAP is not required by regulation, QAPPs can be
prepared such that they present sufficient detail to cover both the QAPP and WAP or SAP in
a single document. If a WAP or SAP is prepared along with a QAPP, it is common for these
documents to reference one another for necessary information. To enhance the usability of
QAPPs and WAPs/SAPs, references between the documents should be specific, providing
the full document name, section number, subsection, and page number.
1.1.4 Data Quality Indicators (DQIs)

1.1.4.1 Precision
1.1.4.2 Accuracy
1.1.4.3 Representativeness
1.1.4.4 Comparability

Comparability expresses the degree of confidence with which one data set can be
compared to another. It is dependent upon the proper design of the sampling program and
will be satisfied by ensuring that the approved plans are followed and that proper sampling
and analysis techniques are applied. Further, when assessing comparability, data sets
should be of known and documented quality.
1.1.4.5 Completeness
1.1.4.6 Bias
1.1.4.7 Reproducibility
1.1.4.8 Repeatability
1.1.4.9 Sensitivity
1.2 IMPLEMENTATION
Implementation is the second phase of the project life cycle. The implementation phase includes
the following steps:
SOPs are written documents that describe, in detail, the routine procedures to be followed for
a specific operation, analysis, or action. Note that, in most cases, referencing a given method or method number is not sufficient unless the method is performed exactly as written (e.g., a method-defined parameter). Laboratories must have an SOP to demonstrate that their procedure
meets the conditions of the referenced method. Information on how to prepare SOPs can be
found in EPA’s Guidance for the Preparation of Standard Operating Procedures (QA/G6), dated
April 2007.
SOPs enable tasks to be accomplished reproducibly, even if there are changes in the
personnel performing them. Consistent use of an approved SOP ensures conformance with
organizational practices, reduction in the frequency of errors, and improved data comparability and
defensibility. SOPs also serve as resources for training and for ready reference and
documentation of proper procedures.
The SOPs, and/or procedures described in the QAPP, must be followed for all project
activities from sample collection to validation. If projectspecific modifications to SOPs are
needed, they must also be presented in the QAPP. Field and laboratory SOPs should be
provided for the areas discussed below, as applicable:
• Sample Custody SOPs that describe sample receipt and handling; sample storage;
sample security and tracking; and holding times
• Sample collection
• Analytical Method SOPs, including subsampling, sample preparation/cleanup,
calibration, QC, and analysis
• Reagent/Standard Preparation and Traceability SOPs
• Equipment Calibration and Maintenance SOPs
• Corrective Action SOPs
• Data Reduction SOPs
• Data Reporting SOPs
• Records Management SOPs
• Waste Disposal SOPs
As indicated above, the QC associated with each method may be provided in appropriate SOPs.
The types of QC parameters to include in the SOPs and QAPP are defined in the glossary of this
chapter. Because QC acceptance limits are often adjusted more frequently than SOPs are revised, it may be impractical to update SOPs as these limits change. Therefore, field and laboratory QC limits may need to be presented in QAPP tables rather than SOPs.
The data collection effort is performed in accordance with approved plans (e.g., QAPP).
The QAPP will present all proposed methods (i.e., including sampling, subsampling, preservation,
preparation, cleanup and determinative methods), as well as a rationale for why the proposed
methods are sufficient to meet the project DQOs.
The QAPP should also indicate whether the proposed analyses include method-defined parameters and/or whether the flexible approach to methods will be utilized. For the flexible
approach, the QAPP must also demonstrate that the proposed methods are adequate to reach the
study goals outlined in the DQOs.
Chapter Two of this manual, entitled "Choosing the Correct Procedure," provides additional
guidance regarding the selection of appropriate methods and method flexibility.
The QAPP should identify all personnel responsible for performing technical assessments, the authority of the auditor (e.g., whether the auditor has the authority to stop work if significant deviations from planning documents are found), and the planned audit frequency. Additionally, if technical assessments are planned or required as part of a project, the QAPP should provide the applicable requirements.
While most technical assessments will be scheduled in advance with an organization, there
may be occasions when unannounced technical assessments are needed. However, an
unannounced audit may not reveal a representative picture of project activities if it occurs when the project is not active, or when only activities outside the scope of the technical assessment are occurring.
Technical assessments may be planned as part of the pre-award activities, or throughout the life of a project. For example, pre-award audits, including performance evaluation (PE) samples, are useful tools in determining the ability of a laboratory to perform the proposed analytical work.
Additionally, technical assessments may also be used as an investigative tool where problems
may be suspected. The EPA document Guidance on Technical Audits and Related Assessments
for Environmental Data Operations (QA/G-7), dated January 2000, provides detailed information
on various types of technical audits and includes an example checklist for a technical systems
audit of a laboratory measurement system.
1.3 ASSESSMENT
Following planning (Sec. 1.1) and implementation (Sec. 1.2), assessment is the third and
final phase of the project data generation life cycle. The purpose of this phase is to evaluate
whether the data are of the type, quantity and quality needed to meet project DQOs. Assessment
can involve many different complex activities, including the use of statistical tools. This section
provides an introduction and overview of these data assessment activities.
Data verification and/or validation is the first step in the assessment process. Validation and
verification are defined in the EPA Guidance on Environmental Data Verification and Data
Validation (QA/G-8), dated November 2002. According to QA/G-8, verification is the process of
evaluating the completeness, correctness, and conformance/compliance of a data set against a
specified set of requirements. QA/G-8 defines validation as an analyte- and sample-specific
process that extends the evaluation of data beyond data verification to determine its analytical
quality. For the purposes of this chapter, it should be noted that the meaning of verification and
validation is often program, organization and/or system specific. Further, these terms are used
interchangeably at times, and in other cases have very different meanings. Therefore, it is critical
that the procedures for verification and/or validation be clearly documented in the approved QAPP.
In general, data verification/validation involves the examination and review of project data
(e.g., field and laboratory data for all applicable QC measurements) to confirm that the sampling
and analysis protocols specified in the QAPP (and any other planning or contractual documents)
were followed. Data verification/validation also involves examining whether the data met the
method performance and acceptance criteria established in the project QAPP. When these
criteria are not met, data verification/validation procedures should include qualification of the data
(e.g., the J qualifier is commonly used to indicate estimated data and the R qualifier is commonly used to indicate rejected data). The reasons for any failure to meet performance criteria should also be documented.
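
For illustration only, a simplified sketch of how the J and R qualifiers mentioned above might be applied is shown below; the recovery acceptance limits are invented examples, not criteria from this chapter:

    # Hypothetical qualification sketch: apply J (estimated) or R (rejected)
    # based on an invented 70-130% spike-recovery acceptance window.
    def qualify(result: float, spike_recovery_pct: float) -> tuple:
        """Return (result, qualifier) using example acceptance limits."""
        if 70.0 <= spike_recovery_pct <= 130.0:
            return (result, "")    # criteria met: no qualifier
        elif spike_recovery_pct >= 10.0:
            return (result, "J")   # marginal QC: flag result as estimated
        else:
            return (result, "R")   # grossly out of control: reject result

    print(qualify(12.4, 115.0))    # -> (12.4, '')
    print(qualify(12.4, 55.0))     # -> (12.4, 'J')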
In addition to QA/G-8, EPA has published several references which provide more detailed
information on what should be included in data verification and validation procedures. These
include EPA regional and programmatic validation guidance documents that may apply to
particular projects. It is also common for EPA guidance from one program or region to be
modified for use in another area. While this may be acceptable, any modifications or changes to
the established validation procedures should be included in the approved QAPP.
Data verification/validation activities may be performed internally (e.g., when the laboratory or data collector reviews its own data), or externally by an independent entity (i.e., a party that is
not associated with the data collection effort). Internal and external data verification/validation
procedures should be defined in the QAPP or other project documents.
Data quality assessment (DQA) is the second step in the assessment process. DQA is
often needed because data verification/validation alone is generally not sufficient to determine
whether a data set can be used for its intended purpose. Typically, the DQA follows the data
verification and/or validation step.
In general, the DQA should include an evaluation of overall trends or biases in the data and
associated QC results, as well as how the data may be affected. For random data sets, the DQA
should evaluate the validity of any statistical assumptions made during the planning phase. If the
statistical assumptions made during the planning phase are not supported by the data,
recommendations for corrective action should be presented. All DQA findings should be
summarized in a report.
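
As one invented example of verifying a planning-phase statistical assumption during DQA (normality is assumed by many parametric tests; the data below are fabricated for illustration):

    # Minimal DQA sketch: check the normality assumption with a Shapiro-Wilk
    # test before relying on a parametric test planned during the DQO process.
    from scipy import stats

    data = [2.1, 2.4, 1.9, 2.8, 2.2, 2.6, 2.0, 2.5]  # invented results
    w_stat, p_value = stats.shapiro(data)
    if p_value < 0.05:
        print("Normality not supported; consider a nonparametric alternative.")
    else:
        print("No evidence against normality; the planned test may proceed.")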
Guidance documents available from EPA that discuss the DQA process include Data Quality
Assessment: A Reviewer’s Guide (QA/G9R), dated February 2006, and Data Quality
Assessment: Statistical Tools for Practitioners (QA/G9S), dated February 2006.
It is recommended that the data user refer to these guidance documents, because they
provide extensive information about DQA and the statistical tools that can be employed.
Although the DQA process described in QA/G-9R and QA/G-9S includes a significant number of statistical procedures, it should be noted that the DQA process is not only applicable to random data sets, where statistics can be used to assess the data. A DQA should also be
performed on judgmental, or biased, data sets. The process for how any data set will be
assessed should be determined during the planning phase, and documented in the QAPP or other
site documents.
Figure 1. The project life cycle under EPA's quality system: systematic planning (e.g., the DQO process) and the QA Project Plan in the planning phase; conducting the study/experiment under standard operating procedures, with technical assessments, in the implementation phase; and data verification and validation followed by data quality assessment in the assessment phase.
1.4 REFERENCES AND SOURCES FOR ADDITIONAL INFORMATION ON PROJECT QUALITY ASSURANCE AND CONTROL

1. USEPA. Guidance on Systematic Planning Using the Data Quality Objectives Process, EPA
QA/G-4. Quality Staff, Office of Environmental Information, United States Environmental
Protection Agency, Washington, D.C. February 2006.
2. USEPA. EPA Quality Manual for Environmental Programs CIO 2105-P-01-0. Office of
Environmental Information Quality Staff, United States Environmental Protection Agency,
Washington, D.C. May 5, 2000.
3. USEPA. Guidance on Choosing a Sampling Design for Environmental Data Collection, EPA
QA/G-5S. Quality Staff, Office of Environmental Information, United States Environmental
Protection Agency, Washington, D.C. December 2002.
4. USEPA. Guidance on Environmental Data Verification and Data Validation, EPA QA/G-8.
Quality Staff, Office of Environmental Information, United States Environmental Protection
Agency, Washington, D.C. November 2002.
5. USEPA. Data Quality Assessment: A Reviewer’s Guide, QA/G-9R. Quality Staff, Office
of Environmental Information, United States Environmental Protection Agency, Washington,
D.C. February 2006.
6. USEPA. Data Quality Assessment: Statistical Tools for Practitioners, QA/G-9S. Quality
Staff, Office of Environmental Information, United States Environmental Protection Agency,
Washington, D.C. February 2006.
7. USEPA. Guidance for Preparing Standard Operating Procedures, EPA QA/G-6. Quality
Staff, Office of Environmental Information, United States Environmental Protection Agency,
Washington, D.C. April 2007.
8. USEPA. RCRA Waste Sampling Draft Technical Guidance. Office of Solid Waste, United
States Environmental Protection Agency, Washington, D.C. August 2002.
9. USEPA. Guidance for Quality Assurance Project Plans, EPA QA/G-5. Quality Staff, Office
of Environmental Information, United States Environmental Protection Agency, Washington,
D.C. December 2002.
10. USEPA. EPA Requirements for Quality Assurance Project Plans, EPA QA/R-5. Quality
Staff, Office of Environmental Information. United States Environmental Protection Agency,
Washington, D.C. March 2001.
11. USEPA. RCRA Delisting Program Guidance Manual for the Petitioner. United States
Environmental Protection Agency, March 23, 2000.
12. USEPA. Waste Analysis at Facilities That Generate, Treat, Store and Dispose of
Hazardous Wastes, ECDIC 2002-011. Office of Waste Programs Enforcement, United
States Environmental Protection Agency. April 1994.
13. USEPA. Guidance on Technical Audits and Related Assessments for Environmental Data
Operations, QA/G-7. Quality Staff, Office of Environmental Information, United States
Environmental Protection Agency, Washington, D.C. January 2000.
1.5 GLOSSARY

Also see the following for a glossary of quality-related terms developed by EPA:
http://www.epa.gov/fem/pdfs/Env_Measurement_Glossary_Final_Jan_2010.pdf
ANALYTICAL BATCH A group of samples, including quality control samples, which are
processed together using the same method, the same lots of reagents,
and at the same time or in continuous, sequential time periods.
Samples in each batch should be of similar composition and share
common internal quality control standards. For QC purposes, the number of samples in a batch is limited to 20; laboratory QC samples are not included in the batch count.
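
A minimal sketch of the batching rule above (sample IDs invented; real batching must also honor the same-method, same-reagent-lot, and same-time conditions):

    # Chunk field samples into analytical batches of at most 20; laboratory
    # QC samples, added separately, do not count toward the limit.
    def make_batches(field_samples: list, limit: int = 20) -> list:
        """Split field samples into batches of at most `limit` samples."""
        return [field_samples[i:i + limit]
                for i in range(0, len(field_samples), limit)]

    samples = [f"S{n:03d}" for n in range(1, 46)]   # 45 invented sample IDs
    print([len(b) for b in make_batches(samples)])  # -> [20, 20, 5]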
BIAS The systematic or persistent distortion of a measurement process that causes errors in one direction. Bias can be estimated from spiked samples as:

B = (xs - xu) - K

where:

xs = the mean measured value of the spiked samples,
xu = the mean measured value of the unspiked samples, and
K = the known value of the spike added.
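
A runnable transcription of the bias formula above (all values invented):

    # B = (mean of spiked results - mean of unspiked results) - known spike K
    import statistics

    def bias(spiked: list, unspiked: list, spike_amount: float) -> float:
        """Estimate bias B from spiked and unspiked sample means."""
        return (statistics.mean(spiked) - statistics.mean(unspiked)) - spike_amount

    print(round(bias([14.6, 15.1, 14.9], [5.0, 5.2, 4.9], 10.0), 2))  # -> -0.17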
CALIBRATION BLANK A calibration blank is a sample of analyte-free media that can be used
along with prepared standards to calibrate the instrument. A
calibration blank may also be used to verify absence of instrument
contamination (e.g., initial calibration blank and continuing calibration
blank).
CALIBRATION CHECKS Calibration check analyses are used to assess calibration drift and
memory effects over time for each analytical system. These analyses
may include zero, span (low and high) to cover the full calibration
range, and mid-range checks, depending on the method.
DATA QUALITY INDICATORS (DQIs) The quantitative statistics and qualitative descriptors that are used to interpret the degree of acceptability or utility of data to the user. The principal indicators of data quality are precision, bias, accuracy, representativeness, comparability, completeness, and sensitivity.
DATA QUALITY OBJECTIVES (DQOs) Qualitative and quantitative statements derived from the DQO planning process that clarify the purpose of the study, define the most appropriate type of information to collect, determine the most appropriate conditions from which to collect that information, and specify tolerable levels of potential decision errors.
DATA VALIDATION The process of evaluating the available data against the project DQOs
to make sure that the objectives are met. Data validation may be very
rigorous, or cursory, depending on project DQOs. The available data
reviewed will include analytical results, field QC data, laboratory QC
data, and may also include field records.
EQUIPMENT BLANK A sample of analyte-free media which has been used to rinse the
sampling equipment. It is collected after completion of
decontamination and prior to sampling at a location. This blank is
useful in documenting adequate decontamination of sampling
equipment.
FIELD BLANK Field blanks include any sample submitted from the field that is
identified as a blank. These include trip blanks, rinsates, equipment
blanks, etc.
FIELD DUPLICATES Field duplicates are useful in documenting the precision of the
sampling process. Field duplicates are used to assess improper
homogenization of the samples in the field; reproducibility of sample
preparation and analysis; and, heterogeneity of the matrix.
FIELD SPLIT SAMPLES A type of field duplicate where the sample is homogenized and then divided into two or more aliquots so that variability can be evaluated (i.e., often between laboratories or methods). Homogenization may have an impact on sample integrity for some sample types (e.g., VOCs).
INTERNAL STANDARD Internal standards may be spiked into prepared field samples and QC
samples (or sample extracts). Their recovery is generally used to
demonstrate the lack of severe matrix effect in the instrumental
analysis by setting criteria for the internal standard response in
comparison to a response in a sample with a known lack of matrix
effect (i.e., a standard). Internal standards are also used to account
for matrix effects and/or variability in instrument response by
normalizing the response of the target analytes and surrogates,
thereby decreasing measurement bias to the extent that their behavior
mimics that of the target analytes.
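
As a simplified illustration of the normalization described above (a single-point sketch with invented instrument responses; real methods use multi-point calibration and method-specific criteria):

    # Express the analyte response relative to the internal standard response
    # so that matrix effects and drift shared by both largely cancel.
    def relative_response(analyte_area: float, is_area: float) -> float:
        """Ratio of target analyte response to internal standard response."""
        return analyte_area / is_area

    calibration_ratio = relative_response(52000.0, 100000.0)  # from a standard
    sample_ratio = relative_response(26500.0, 98000.0)        # from a sample
    std_concentration = 10.0                                  # ug/L in standard
    estimate = std_concentration * sample_ratio / calibration_ratio
    print(f"Estimated concentration: {estimate:.2f} ug/L")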
LOWER LIMIT OF QUANTITATION (LLOQ) The lowest point of quantitation which, in most cases, is the lowest concentration in the calibration curve. The LLOQ is initially verified by
spiking a clean control material (e.g., reagent water, method blanks,
Ottawa sand, diatomaceous earth, etc.) at the LLOQ and processing
through all preparation and determinative steps of the method.
Laboratory-specific recovery limits should be established when
sufficient data points exist. Individual methods may recommend
procedures for verifying the LLOQ and acceptance limits for use until
the laboratory has sufficient data to determine acceptance limits.
LLOQs should be determined at a frequency established by the
method, laboratory’s quality system, or project.
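
For illustration, a minimal sketch of the LLOQ recovery check described above; the 50-150% window is an invented placeholder, since actual limits come from the method or the laboratory's own data:

    # Compare percent recovery of an LLOQ-level spike to acceptance limits.
    def lloq_recovery_ok(measured: float, spiked: float,
                         low_pct: float = 50.0, high_pct: float = 150.0) -> bool:
        """True if recovery of the LLOQ spike falls within the window."""
        recovery = 100.0 * measured / spiked
        return low_pct <= recovery <= high_pct

    print(lloq_recovery_ok(measured=0.42, spiked=0.50))  # 84% recovery -> True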
MATRIX SPIKE Matrix spikes are aliquots of environmental samples to which known
concentrations of certain target analytes have been added before
sample preparation, cleanup, and determinative procedures have been
implemented. Matrix spike analysis would normally be included with
each preparation batch of samples processed. Under ideal
circumstances, the original, unspiked, field sample will be analyzed
first, to determine the concentration in the unspiked sample.
However, if this approach is not practical, the samples may be spiked
at the midpoint of the calibration range or at the same level as the LCS.
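
The conventional percent-recovery calculation for a matrix spike (the formula is the standard %R convention rather than a quotation from this chapter; the numbers are invented):

    # %R = 100 * (spiked sample result - unspiked sample result) / amount spiked
    def percent_recovery(spiked_result: float, unspiked_result: float,
                         spike_added: float) -> float:
        """Percent recovery of the spiked amount from the sample matrix."""
        return 100.0 * (spiked_result - unspiked_result) / spike_added

    print(round(percent_recovery(14.8, 5.1, 10.0), 1))  # -> 97.0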
MATRIX SPIKE DUPLICATES Matrix spike duplicates are additional replicates of matrix spike samples that are subjected to the same sample preparation and analytical scheme as the original sample. A matrix spike duplicate sample
would normally be included with each preparation batch of samples
processed. Analysis of spiked duplicate samples ensures a positive
value, allowing for estimation of analytical precision.
PRECISION The agreement among a set of replicate measurements without assumption of knowledge of the true value. Precision is usually expressed as the relative standard deviation (RSD) of replicate measurements:

RSD = (S / x̄) × 100

where:

x̄ = the arithmetic mean of the i measurements, and
S = the square root of the variance of the i measurements.

For duplicate measurements x1 and x2, precision is commonly expressed as the relative percent difference (RPD):

RPD = ( |x1 - x2| / ((x1 + x2) / 2) ) × 100
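
A direct, runnable transcription of the RSD and RPD formulas above (replicate values invented):

    import statistics

    def rsd(values: list) -> float:
        """Relative standard deviation: (S / mean) * 100."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    def rpd(x1: float, x2: float) -> float:
        """Relative percent difference: |x1 - x2| / ((x1 + x2) / 2) * 100."""
        return 100.0 * abs(x1 - x2) / ((x1 + x2) / 2.0)

    print(round(rsd([9.8, 10.2, 10.1, 9.9]), 2))  # RSD of four replicates
    print(round(rpd(9.8, 10.2), 2))               # RPD of a duplicate pair -> 4.0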
PROJECT Single or multiple data collection activities that are related through the
same planning sequence.
QUALITY CONTROL SAMPLE A sample made from standards or matrix and used to verify acceptability of the results from preparation and/or analysis of a batch
of samples. Examples of laboratory quality control samples are
method blanks, laboratory duplicates, and laboratory control samples;
field quality control samples are field blanks, trip blanks, field
duplicates, and matrix spikes.
REAGENT GRADE Analytical reagent (AR) grade, ACS reagent grade, and reagent grade
are synonymous terms for reagents which conform to the current
specifications of the Committee on Analytical Reagents of the
American Chemical Society.
REAGENT WATER Water that has been generated by any method which would achieve
the performance specifications for ASTM Type II water. For organic
analyses, see the definition of organic-free reagent water.
STANDARD ADDITION The addition of a known amount of analyte to the sample in order to
determine the relative response of the detector to an analyte within the
sample matrix. The relative response is then used to assess either an
operative matrix effect or the sample analyte concentration.
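
A simplified single-addition calculation consistent with the definition above (this form neglects the volume change on spiking; values are invented):

    # Estimate sample concentration from detector responses measured before
    # and after a known standard addition, assuming linear response.
    def standard_addition_conc(resp_unspiked: float, resp_spiked: float,
                               conc_added: float) -> float:
        """Cx = Rx * Cadded / (Rspiked - Rx) for a linear detector."""
        return resp_unspiked * conc_added / (resp_spiked - resp_unspiked)

    # Response doubles after adding 5.0 ug/L, implying ~5.0 ug/L in the sample.
    print(standard_addition_conc(1000.0, 2000.0, 5.0))  # -> 5.0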
TRIP BLANK A sample of analyte-free media taken from the laboratory to the
sampling site and returned to the laboratory unopened. Trip blanks
should be prepared at a frequency of one per day of sampling during
which samples are collected for volatile organic constituents (VOCs).
Trip blanks are prepared prior to the site visit at the time sample
containers are shipped to the site. The trip blank should accompany
the sampling kits throughout all the sample collection and transport
operations. This blank will not be opened during the sampling
activities and will be used to assess sample VOC contamination
originating from sample transport, shipping, or site conditions. A trip
blank is used to document contamination attributable to shipping and
field handling procedures.
APPENDIX A - SUMMARY OF CHANGES

1. The entire Chapter has been rewritten and reorganized to reflect changes in the EPA data
quality system approach.
2. The revision number was changed to two and the date published to July 2014.
3. This appendix was added to document changes made during the editorial process.
4. The document was updated to match the current SW-846 style guidelines.
5. Figure 1 was added based on information in EPA QA/G-8, November 2002, Figure 1.