Influence on Air Flow / Influence on Monitoring Site Selection (continued)

Slope/Valley
Influence on air flow: Downward air currents at night and on cold days; up-slope winds on clear days when valley heating occurs; slope winds and valley-channeled winds; tendency toward down-slope and down-valley winds; tendency toward inversions.
Influence on monitoring site selection: Slopes and valleys as special sites for air monitors because pollutants generally are well dispersed; concentration levels not representative of other geographic areas; possible placement of monitor to determine concentration levels in a population or industrial center in valley.

Water
Influence on air flow: Sea or lake breezes inland or parallel to shoreline during the day or in cold weather; land breezes at night.
Influence on monitoring site selection: Monitors on shorelines generally for background readings or for obtaining pollution data on water traffic.

Hill
Influence on air flow: Sharp ridges causing turbulence; air flow around obstructions during stable conditions, but over obstructions during unstable conditions.
Influence on monitoring site selection: Depends on source orientation; upwind source emissions generally mixed down the slope, and siting at foot of hill not generally advantageous; downwind source emissions generally downwashed near the source; monitoring close to a source generally desirable if population centers are adjacent or if monitoring protects workers.

Natural or manmade obstruction
Influence on air flow: Eddy effects.
Influence on monitoring site selection: Placement near obstructions may not produce representative readings.
Pollutant Considerations - A sampling site or an array of sites for one pollutant may not be appropriate for another pollutant species because of the configuration of sources, the local meteorology or the terrain.
Pollutants undergo changes in their compositions between their emission and their detection; therefore,
the impact of that change on the measuring system should be considered. Atmospheric chemical
reactions such as the production of O3 in the presence of NOx and hydrocarbons (HCs), and the time delay between the emission of NOx and HCs and the detection of peak O3 values, may require either a sampling network for the precursors of O3 and/or a different network for the actual O3 measurement.
The success of the PAMS monitoring program is predicated on the fact that no site is unduly influenced
by any one stationary emissions source or small group of emissions sources. Any significant influences
would cause the ambient levels measured by that particular site to mimic the emissions rates of this
source or sources rather than following the changes in nonattainment area-wide emissions as intended by
the Rule. For purposes of this screening procedure, if more than 10% of the typical lower end
concentration measured in an urban area is due to a nearby source of precursor emissions, then the PAMS
site should be relocated or a more refined analysis conducted than is presented here. Detailed procedures
can be found in the PAMS Implementation Manual [11].
None of the factors mentioned above stand alone. Each is dependent in part on the others. However, the
objective of the sampling program must be clearly defined before the selection process can be initiated,
and the initial definition of priorities may have to be reevaluated after consideration of the remaining
factors before the final site selection. While the interactions of the factors are complex, the site selection
problems can be resolved. Experience in the operation of air quality measurement systems; estimates of
air quality, field and theoretical studies of air diffusion; and considerations of atmospheric chemistry and
air pollution effects make up the required expertise needed to select the optimum sampling site for
obtaining data representative of the monitoring objectives.
[11] http://www.epa.gov/ttn/amtic/pamsmain.html
6.2.1 PAMS Site Descriptions
The PAMS network array for an area should be fashioned to supply measurements that will assist States
in understanding and solving ozone nonattainment problems. Table 6-4 describes the five site types
identified in the PAMS network. In 2007, EPA determined that the number of required PAMS sites could
be reduced. Only one Type 2 site is required per area regardless of population; Type 4 sites would not be
required; and only one Type 1 or one Type 3 site would be required per area.
Table 6-4 Site Descriptions of PAMS Monitoring Sites
Type 1 (urban scale): Upwind and background characterization to identify those areas which are subjected to overwhelming incoming transport of ozone. The #1 Sites are located in the predominant morning upwind direction from the local area of maximum precursor emissions and at a distance sufficient to obtain urban scale measurements. Typically, these sites will be located near the upwind edge of the photochemical grid model domain.

Type 2 (neighborhood scale): Maximum ozone precursor emissions impacts. These sites are located immediately downwind (using the same morning wind direction as for locating Site #1) of the area of maximum precursor emissions and are typically placed near the downwind boundary of the central business district (CBD) or primary area of precursor emissions mix to obtain neighborhood scale measurements.

Type 2a (neighborhood scale): Maximum ozone precursor emissions impacts, using the second-most predominant morning wind direction.

Type 3 (urban scale): Maximum ozone concentrations occurring downwind from the area of maximum precursor emissions. Locations for #3 Sites should be chosen so that urban scale measurements are obtained. Typically, these sites are located 10 to 30 miles from the fringe of the urban area.

Type 4 (urban scale): Extreme downwind monitoring of transported ozone and its precursor concentrations exiting the area; these sites will identify those areas which are potentially contributing to overwhelming ozone transport into other areas. The #4 Sites are located in the predominant afternoon downwind direction from the local area of maximum precursor emissions at a distance sufficient to obtain urban scale measurements. Typically, these sites will be located near the downwind edge of the photochemical grid model domain.
There are three fundamental criteria to consider when locating a final PAMS site: sector analysis,
distance, and proximate sources. These three criteria are considered carefully by EPA when approving or
disapproving a candidate site for PAMS.
6.2.2 NCore Site Descriptions
NCore is a multi-pollutant network that integrates several advanced measurement systems for particles,
pollutant gases and meteorology. Most NCore stations have been operating since the formal start of the
network on January 1, 2011. The NCore Network addresses the following objectives:
Timely reporting of data to the public by supporting AIRNow, air quality forecasting, and other
public reporting mechanisms;
Support for development of emission strategies through air quality model evaluation and other
observational methods;
Accountability of emission strategy progress through tracking long-term trends of criteria and
non-criteria pollutants and their precursors;
Support for long-term health assessments that contribute to ongoing reviews of the NAAQS;
Compliance through establishing nonattainment/attainment areas through comparison with the
NAAQS;
Support to scientific studies ranging across technological, health, and atmospheric process
disciplines; and
Support to ecosystem assessments recognizing that national air quality networks benefit
ecosystem assessments and, in turn, benefit from data specifically designed to address ecosystem
analyses.
The NCore network began Jan 1, 2011, consisting of 80 sites: 63 urban and 17 rural. For more detailed information on each specific site, click on the "sites map" link, which connects to each site's Characterization Report.
NCore is both a repackaging and an enhancement of existing networks. The emphasis on the term Core
reflects a multi-faceted, multi-pollutant national network that can be complemented by more specific
efforts, such as intensive field campaigns to understand atmospheric processes, or personal and indoor
measurements to assess human exposure and health effects. The NCore network will replace the current National Air Monitoring Stations (NAMS) network and leverage all of the major existing networks to produce an integrated multi-pollutant approach to air monitoring.
Emphasis is placed on a backbone of multi-pollutant sites, continuous monitoring methods, and
measurement of important pollutants other than the criteria pollutants (e.g., ammonia and NOy).
When complete, NCore will meet a number of important data needs: improved flow and timely reporting
of data to the public, including supporting air quality forecasting and information systems such as
AIRNow; continued determination of NAAQS compliance; improved development of emissions control
strategies; enhanced accountability for the effectiveness of emission control programs; and more complete
information for scientific, public health, and ecosystem assessments. Structurally, NCore will establish
three levels of monitoring sites:
Level 1: a small number of research-oriented sites accommodating the greatest diversity of instrumentation with specific targeted objectives, reasonably analogous to the current PM Supersite program;

Level 2: the backbone network of approximately 75 long-term, nationwide multi-pollutant sites, encompassing both urban (about 55 sites) and rural (about 20 sites) locations;

Level 3: sites focused primarily on specific pollutants of greatest concern (PM and O3), with as few as one measured parameter. It is estimated that over 1,000 Level 3 sites will be part of NCore.
Specific design criteria for NCore can be found in 40 CFR Part 58 Appendix D.
6.3 Minimum Network Requirements
Rather than place tables for minimum monitoring site requirements in the Handbook (since they have a tendency to change), the reader is directed to 40 CFR Part 58, Appendix D [12] of the most current regulation to find the appropriate minimum monitoring network requirements.

[12] http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title40/40tab_02.tpl
6.4 Operating Schedules
NOTE: The reader should check the most current version of 40 CFR Part 58 to ensure the
schedules below have not changed.
For continuous analyzers, consecutive hourly averages must be collected except during:
1. periods of routine maintenance;
2. periods of instrument calibration, quality control checks or performance evaluation; or
3. periods or monitoring seasons exempted by the Regional Administrator.
For Pb manual methods, at least one 24-hour sample must be collected every 6 days except during
periods or seasons exempted by the Regional Administrator.
For PAMS VOC samplers, samples must be collected as specified in 40 CFR Part 58, Appendix D
Section 5. Area-specific PAMS operating schedules must be included as part of the PAMS network
description and must be approved by the Regional Administrator.
For manual PM2.5 samplers:

1. Manual PM2.5 samplers at SLAMS stations: a 24-hour sample must be taken from midnight to midnight (local time) to ensure national consistency and, other than at NCore stations, must operate on at least a 1-in-3 day schedule at sites without a collocated continuously operating PM2.5 monitor. For SLAMS PM2.5 sites with both manual and continuous PM2.5 monitors operating, the monitoring agency may request approval from the EPA Regional Administrator for a reduction to 1-in-6 day PM2.5 sampling or for seasonal sampling. The EPA Regional Administrator may grant sampling frequency reductions after consideration of factors including, but not limited to, historical PM2.5 data quality assessments, the location of current PM2.5 design value sites, and their regulatory data needs.
Required SLAMS stations whose measurements determine the design value for their area and that are within plus or minus 10 percent of the NAAQS, and all required sites where one or more 24-hour values have exceeded the NAAQS each year for a consecutive period of at least 3 years, are required to maintain at least a 1-in-3 day sampling frequency. A continuously operating FEM or ARM PM2.5 monitor satisfies this requirement. Required SLAMS stations whose measurements determine the 24-hour design value for their area and whose data are within plus or minus 5 percent of the level of the 24-hour PM2.5 NAAQS must have an FRM or FEM operate on a daily schedule if that area's design value for the annual NAAQS is less than the level of the annual PM2.5 standard. A continuously operating FEM or ARM PM2.5 monitor satisfies this requirement unless it is identified in the monitoring agency's annual monitoring network plan as not
appropriate for comparison to the NAAQS. The national sampling schedule can be found on AMTIC [13]; a sketch of these sampling-frequency rules follows this list.

[13] http://www.epa.gov/ttn/amtic/calendar.html
2. Manual PM2.5 samplers at NCore stations and at required regional background and regional transport sites must operate on at least a 1-in-3 day sampling frequency.

3. Manual PM2.5 speciation samplers at CSN stations must operate on a 1-in-3 day sampling frequency.
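As a rough illustration only, the sketch below encodes the manual PM2.5 sampling-frequency rules from item 1 as a Python function. The function name, argument names, and return strings are hypothetical conveniences; 40 CFR 58.12 remains the authoritative source.

```python
# Hypothetical sketch of the PM2.5 sampling-frequency rules paraphrased
# above; consult 40 CFR 58.12 before applying to a real network.
def minimum_pm25_frequency(dv_pct_of_24hr_naaqs: float,
                           annual_dv_below_annual_std: bool,
                           exceeded_naaqs_each_of_3_years: bool) -> str:
    """Return the minimum manual PM2.5 sampling frequency for a SLAMS site."""
    # Daily FRM/FEM sampling: 24-hour design value within +/-5% of the
    # 24-hour NAAQS while the annual design value is below the annual standard.
    if abs(dv_pct_of_24hr_naaqs - 100.0) <= 5.0 and annual_dv_below_annual_std:
        return "daily (a continuously operating FEM/ARM also satisfies this)"
    # At least 1-in-3 day: design value within +/-10% of the NAAQS, or
    # 24-hour exceedances in each of the last 3 consecutive years.
    if abs(dv_pct_of_24hr_naaqs - 100.0) <= 10.0 or exceeded_naaqs_each_of_3_years:
        return "at least 1-in-3 day (FEM/ARM also satisfies this)"
    # Otherwise the 1-in-3 day default applies; 1-in-6 day or seasonal
    # sampling may be approved by the EPA Regional Administrator.
    return "1-in-3 day default; reductions possible with RA approval"

print(minimum_pm25_frequency(96.0, True, False))  # -> daily
```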
For PM10 samplers, a 24-hour sample must be taken from midnight to midnight (local time) to ensure national consistency. The minimum monitoring schedule for the site in the area of expected maximum concentration shall be based on the relative level of that monitoring site's concentration with respect to the 24-hour standard, as illustrated in Figure 6.2. If the operating agency demonstrates by monitoring data that during certain periods of the year conditions preclude violation of the PM10 24-hour standard, the increased sampling frequency for those periods or seasons may be exempted by the Regional Administrator and permitted to revert back to once in six days. The minimum sampling schedule for all other sites in the area remains once every six days.

Figure 6.2 Sampling schedule based on ratio to the 24-hour PM10 NAAQS
For manual PM10-2.5 samplers:

1. Manual PM10-2.5 samplers at NCore stations must operate on at least a 1-in-3 day schedule at sites without a collocated continuously operating federal equivalent PM10-2.5 method that has been designated in accordance with 40 CFR Part 53.

2. Manual PM10-2.5 speciation samplers at NCore stations must operate on at least a 1-in-3 day sampling frequency.
For NATTS monitoring, samplers must operate year-round and follow the national 1-in-6 day sampling schedule.
6.4.1 Operating Schedule Completeness
Data required for comparison to the NAAQS have specific completeness requirements. These
completeness requirements generally start from completeness at hourly and 24-hour concentration values.
However, the data used for NAAQS determinations include 3-hour, 8-hour, quarterly, annual and multiple
year levels of data aggregation. Generally, depending on the calculation of the design value, EPA requires
data to be 75% complete. All continuous measurements ultimately rest on what is considered a valid hour, and currently all 24-hour estimates based on sampling (manual PM, Pb, TSP) are based on a 24-hour sampling period. Table 6-5 provides the completeness goals for the various ambient air monitoring programs.
The data cells marked with an asterisk in Table 6-5 refer to the standards that apply to the specific pollutant. Even though a marked cell lists the completeness requirement, CFR provides additional detail, in some cases, on how a design value might be calculated with less data than the stated requirement. Therefore, the information provided in Table 6-5 should be considered the initial completeness goal. Completeness goals that are not marked, although not covered in CFR, are very important to the achievement of the CFR completeness goals. So, for example, even though there is only an 8-hour ozone standard, it is important to have complete 1-hour values in order to compare to the 8-hour standard.
Table 6-5 Completeness Goals for Ambient Air Monitoring Data

| Pollutant | 1-hour | 3-hour | 8-hour | 24-hour | Quarterly | Annual |
| CO | 45, 1-min. values* | | 75% of hourly values* | 75% of hourly values | 75% of hourly values per quarter | |
| O3 | 45, 1-min. values | | 75% of hourly values* | | | |
| SO2 | 45, 1-min. values* | All 3 hours 75% complete* | | 75% of hourly values | 75% of hourly values per quarter | |
| NO2 | 45, 1-min. values* | | | | 75% of hourly values per quarter* | |
| PM10 Cont. | 45, 1-min. values | | | 18 hours* | | |
| PM2.5 Cont. | 45, 1-min. values | | | 18 hours* | | |
| PM10 Manual | | | | 23 hours*, ** | | |
| PM2.5 Manual | | | | 23 hours* | 75% of samples* | |
| Pb | | | | 23 hours | 3-month avg: >75% of monthly means* | |
| PAMS | | | | 23 hours | | |
| NATTS | | | | 23 hours | | |
| CSN | | | | 23 hours | | |

* Cell associated with a standard that applies to the pollutant.
** Not defined in CFR.
For continuous instruments, it is suggested that 45 one-minute values be considered a valid hour. Therefore, it is expected that 1-minute concentration values would be archived for a period of time (see statute of limitations in Section 5). Since various QC checks take time to complete (zero/span/1-point QC), it is suggested that they be implemented in a manner that spans two hours (e.g., 11:45 PM to 12:15 AM) in order to avoid losing an hour's worth of data.
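The valid-hour and 75% conventions above can be expressed compactly. The sketch below is illustrative only; the helper names are hypothetical, and pollutant-specific rules in CFR take precedence.

```python
# Illustrative sketch of the completeness conventions described above:
# an hour is valid when at least 45 one-minute values were collected,
# and hourly data meet the general 75% goal for a period when at least
# 75% of the period's hours are valid.
def is_valid_hour(n_one_minute_values: int) -> bool:
    return n_one_minute_values >= 45

def meets_75_pct(n_valid_hours: int, n_hours_in_period: int) -> bool:
    return n_valid_hours / n_hours_in_period >= 0.75

# Example: 1,700 valid hours in a 91-day quarter (2,184 hours) -> 77.8%
print(meets_75_pct(1700, 2184))  # True
```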
6.4.2 Monitoring Seasons
Most of the monitoring networks operate year-round, with the exception of PAMS and ozone monitoring.

PAMS - 40 CFR 58, Appendix D stipulates that PAMS precursor monitoring must be conducted annually throughout the months of June, July and August (as a minimum), when peak O3 values are expected in each area. Alternate precursor monitoring periods may be submitted for approval to the Administrator as part of the annual monitoring network plan.
Ozone - Since O3 levels decrease significantly in the colder parts of the year in many areas, O3 is required to be monitored at SLAMS monitoring sites only during the ozone season as designated in the AQS files on a State-by-State basis and described in 40 CFR Part 58, Appendix D [14]. Deviations from the O3 monitoring season must be approved by the EPA Regional Administrator, documented within the annual monitoring network plan, and updated in AQS.

[14] http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title40/40tab_02.tpl
6.5 Network Plan Reporting
The following two types of documents related to the monitoring network are required to be reported to
EPA. Additional information on these assessments can be found in 40 CFR Part 58.10.
Annual Monitoring Network Plan
The monitoring organization shall submit to the Regional Administrator an annual monitoring network
plan which shall provide for the establishment and maintenance of an air quality surveillance system that
consists of a network of SLAMS monitoring stations including FRM, FEM, and ARM monitors that are
part of SLAMS, NCore stations, CSN stations, State speciation stations, SPM stations, and/or, in serious,
severe and extreme ozone nonattainment areas, PAMS stations, and SPM monitoring stations. The plan
shall include a statement of purposes for each monitor and evidence that siting and operation of each
monitor meets the requirements of appendices A, C, D, and E of 40 CFR Part 58, where applicable. The annual
monitoring network plan must be made available for public inspection for at least 30 days prior to
submission to EPA. These network plans are posted on AMTIC [15].

[15] http://www.epa.gov/ttn/amtic/plans.html
5-Year Network Assessments
The monitoring organization shall perform and submit to the EPA Regional Administrator an assessment
of the air quality surveillance system every 5 years to determine, at a minimum, if the network meets the
monitoring objectives defined in 40 CFR Part 58, Appendix D, whether new sites are needed,
whether existing sites are no longer needed and can be terminated, and whether new technologies are
appropriate for incorporation into the ambient air monitoring network. The network assessment must
consider the ability of existing and proposed sites to support air quality characterization for areas with
relatively high populations of susceptible individuals (e.g., children with asthma) and, for any sites that
are being proposed for discontinuance, the effect on data users other than the agency itself, such as nearby
States and Tribes or health effects studies. For PM2.5, the assessment also must identify needed changes
to population-oriented sites. The State, or where applicable local, agency must submit a copy of this 5-
year assessment, along with a revised annual network plan, to the Regional Administrator.
7.0 The Sampling System
To establish the validity of ambient air monitoring data, it must be shown that:
the proposed sampling method complies with the appropriate monitoring regulations;
the equipment is accurately sited;
the equipment was accurately calibrated using correct and established calibration methods;
there is enough information from data quality indicators to assess data uncertainty;
samples are appropriately handled through proper chain-of-custody procedures; and
the organization implementing the data collection operation is qualified and competent.
For example, if the only reasonable monitoring site has a less than ideal location, the data collection
organization must decide whether a representative sample can be obtained at the site. This determination
should be recorded and included in the program's QAPP. Although after-the-fact site analysis may
suffice in some instances, good quality assurance techniques dictate that this analysis be made prior to
expending the resources required to collect the data.
The purpose of this section is to describe the attributes of the sampling system that will ensure the
collection of data of a quality acceptable for the Ambient Air Quality Monitoring Program. A sampling
system for the ambient air monitoring program will include aspects of:
siting,
the establishment of a monitoring station or platform for monitors/ samplers,
outfitting for electricity, HVAC, water, etc.,
use of appropriate probe and inlet material,
setting up quality control systems, and
information management systems.
Information management systems are described later in this Handbook.
7.1 Monitor Placement
Final placement of the monitor at a selected site depends on physical obstructions and activities in the
immediate area, accessibility/availability of utilities and other support facilities in correlation with the
defined purpose of the specific monitor and its design. Because obstructions such as trees and fences can
significantly alter the air flow, monitors should be placed away from obstructions. It is important for air
flow around the monitor to be representative of the general air flow in the area to prevent sampling bias.
Detailed information on urban physiography (e.g., buildings, street dimensions) can be determined
through visual observations, aerial photography and surveys. Such information can be important in
determining the exact locations of pollutant sources in and around the prospective monitoring site areas.
Network designers should avoid sampling locations that are unduly influenced by downwash or ground
dust (e.g., a rooftop air inlet near a stack or a ground-level inlet near an unpaved road); in these cases, the
sample intake should either be elevated above the level of the maximum ground turbulence effect or
placed at a reasonable distance from the source of ground dust.
Depending on the defined monitoring objective, the monitors are placed according to exposure to
pollution. Due to the various physical and meteorological constraints discussed above, tradeoffs will be
made to locate a site in order to optimize representativeness of sample collection. The consideration
should include categorization of sites relative to their local placements. Suggested categories relating to
sample site placement for measuring a corresponding pollution impact are identified in Table 7-1.
Table 7-1 Monitoring Station Categories Relating to Sample Site Placement
A (ground level): Heavy pollutant concentrations, high potential for pollutant buildup. A site 3 to 5 m (10-16 ft) from a major traffic artery and that has local terrain features restricting ventilation. A sampler probe that is 3 to 6 m (10-20 ft) above ground.

B (ground level): Heavy pollutant concentrations, minimal potential for a pollutant buildup. A site 3 to 15 m (10-50 ft) from a major traffic artery, with good natural ventilation. A sampler probe that is 3 to 6 m (10-20 ft) above ground.

C (ground level): Moderate pollutant concentrations. A site 15 to 60 m (50-200 ft) from a major traffic artery. A sampler probe that is 3 to 6 m (10-20 ft) above ground.

D (ground level): Low pollutant concentrations. A site more than 60 m (>200 ft) from a major traffic artery. A sampler probe that is 3 to 6 m (10-20 ft) above ground.

E (air mass): Sampler probe that is between 6 and 45 m (20-150 ft) above ground. Two subclasses: (1) good exposure from all sides (e.g., on top of building) or (2) directionally biased exposure (probe extended from window).

F (source oriented): A sampler that is adjacent to a point source. Monitoring that yields data directly relatable to the emission source.
7.2 Environmental Control
7.2.1 Monitoring Station Design
State and local agencies should design their monitoring stations with the station operator in mind. Careful thought should be given to safety, ease of access to instruments, and optimal work space. If these issues are addressed, station operators will be able to perform their duties more efficiently and diligently. Having the instruments in an area that is difficult to work in creates frustration, prolongs downtime and may delay required maintenance (e.g., manifolds not being cleaned because they are too hard to get to). The goal is to optimize data collection and quality, and that starts with designing the shelter and laboratory around staff needs and requirements.
Monitoring stations may be located in urban areas where space and land are at a premium, especially in large cities that are monitoring for NOx and CO. In many cases, the monitoring station is located in a
building or school that is gracious enough to allow an agency to locate its equipment. Sometimes, a storage
or janitorial closet is all that is available. However, this can pose serious problems. If the equipment is
located in a closet, then it is difficult for the agency to control the effects of temperature, humidity, light,
vibration and chemicals on the instruments. In addition, security can also be an issue if people other than
agency staff have access to the equipment. Monitoring organizations should give serious thought to
locating air monitoring equipment in stand-alone shelters with limited access, or modify existing rooms to
the recommended station design if funds and staff time are available.
In general, air monitoring stations should be designed for functionality and ease of access for operation,
maintenance and repair. In addition, the shelter should be rugged enough to withstand local weather
condition extremes. In the past, small utility trailers were the norm for monitoring shelters. However, in some areas, this will not suffice. More recently, steel and aluminum storage containers have been gaining wide
acceptance as monitoring shelters. It is recommended that monitoring stations be housed in shelters that
are fairly secure from intrusion or vandalism. All sites should be located in fenced or secure areas with
access only through locked gates or secure pathways. Shelters should be insulated
(R-19 minimum) to prevent temperature extremes within the shelter. All structures should be secured to
their foundations and protected from damage during natural disasters. All monitoring shelters should be
designed to control excessive vibrations and prevent external light from falling on the instruments, and
provide 110/220 VAC voltage throughout the year. When designing a monitoring shelter, make sure that
enough electrical circuits are secured for the current load of equipment plus other instruments that may be
added later or audit equipment (e.g., NPAP/PEP). Every attempt should be made to reduce the
environmental footprint of shelters to make them as energy efficient as possible. Some possibilities include
venting of excess heat of monitoring instruments to the outside in summer months, use of energy efficient
fixtures and HVAC systems, and ensuring that the amount of space devoted to the monitors is not excessive
(remembering that space is needed at times for additional QA equipment). Figure 7.1 (Example Design for Shelter) represents one shelter design that has proven adequate.
The first feature of the shelter is that
there are two rooms separated by a
door. The reasons for this are two-
fold. The entry and access should be
into the computer/data review area.
This allows access to the site without
having to open the room that houses
the equipment. It also isolates the
equipment from cold/hot air that can
come into the shelter when someone
enters. Also, the Data Acquisition
System (DAS)/data review area is
isolated from the noise and vibration
of the equipment. In some cases
vibration and noise can be reduced by
locating pumps outside the shelter (if
appropriate weather conditions exist).
This area can be a place where the
operator can print data, and prepare samples for the laboratory. This also gives the operator an area where
cursory data review can take place. If something is observed during this initial review then possible
problems can be corrected or investigated at that time. The DAS can be linked through cables that travel
through conduit into the equipment area. The conduit is attached to the ceiling or walls and then dropped
down to the instrument rack.
The air conditioning/heating unit should be mounted to heat and cool the equipment room. When
specifying the unit, make sure it will cool the room on the warmest days and heat on the coldest days of the
year. Also, make sure the electrical circuits are able to carry the load. If necessary, keep the door closed
between the computer and equipment room to lessen the load on the heating or cooling equipment.
All air quality instrumentation should be located in an instrument rack or equivalent. The instruments and
their support equipment are placed on sliding trays or rails. By placing the racks away from the wall, the
rear of the instruments remains accessible. The trays or rails allow the site operators access to the instruments
without removing them from the racks. Most instrument vendors offer sliding rails as an optional purchase.
If several instruments are placed in an instrument rack, the labeling of all power cords, sample and
exhaust lines will help to identify where lines and inlets are and it will help when it comes time to
trace things back to an instrument.
7.2.2 Sampling Environment
A proper sampling environment demands control of all physical parameters external to the samples that
might affect sample stability, chemical reactions within the sampler, or the function of sampler
components. The important parameters to be controlled are summarized in Table 7-2.
Table 7-2 Environment Control Parameters
| Parameter | Source of specification | Method of control |
| Instrument vibration | Manufacturer's specifications | Design of instrument housings, benches, etc., per manufacturer's specifications; locate pumps outside if appropriate conditions exist. |
| Light | Method description or manufacturer's specifications | Shield chemicals or instruments that can be affected by natural or artificial light. |
| Electrical voltage | Method description or manufacturer's specifications | Constant voltage transformers or regulators; separate power lines; isolate high-current-drain equipment such as hi-vols, heating baths and pumps from regulated circuits. |
| Temperature | Method description or manufacturer's specifications | Regulated air conditioning system; 24-hour temperature recorder; use electric heating and cooling only. |
| Humidity | Method description or manufacturer's specifications | Regulated air conditioning system; 24-hour temperature recorder. |
With respect to environmental temperature for designated analyzers, most analyzers have been tested and qualified over a temperature range of 20°C to 30°C; few are qualified over a wider range. When outfitting a shelter with monitoring equipment, it is important to recognize and accommodate the instrument with the most sensitive temperature requirement. The temperature range specifies both the range of acceptable operating temperatures and the range of temperature change which the analyzer can accommodate without excessive drift. The latter, the range of temperature change that may occur between zero and span adjustments, is the most important. EPA suggests that shelters be maintained within a standard deviation (SD) of ±2°C over a 24-hour period. The SD can be assessed using 1-hour shelter temperature estimates.
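A minimal sketch of that check follows; the hourly readings are hypothetical examples, not data from any real shelter.

```python
import statistics

# Sketch of the suggested shelter-temperature check: compute the standard
# deviation of 24 hourly shelter temperature values (deg C) and compare
# it to the 2 deg C guidance.
hourly_temps_c = [23.1, 22.8, 22.9, 23.0, 23.4, 23.9, 24.5, 25.1,
                  25.6, 25.9, 26.0, 25.8, 25.4, 24.9, 24.3, 23.8,
                  23.5, 23.2, 23.0, 22.9, 22.8, 22.8, 22.9, 23.0]

sd = statistics.stdev(hourly_temps_c)
print(f"24-hour shelter temperature SD = {sd:.2f} deg C")
if sd > 2.0:
    print("SD exceeds the 2 deg C guidance; check HVAC performance")
```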
To accommodate energy conservation regulations or guidelines specifying lower thermostat settings, designated analyzers located in facilities subject to these restrictions may be operated at temperatures down to 18°C, provided the analyzer temperature does not fluctuate by more than 10°C between zero and span adjustments. Operators should be alert to situations where environmental temperatures might fall below 18°C, such as during night hours or weekends. The HVAC system must be able to keep shelter temperatures above 18°C.
Shelter temperatures above 30°C also occur, due to malfunctioning temperature control equipment, lack of adequate power capacity, or shelters of inadequate design for the environmental conditions. Occasional fluctuations above 30°C may require additional assurances that data quality is maintained. Sites that continually have problems maintaining adequate temperatures may require additional temperature control equipment or rejection of the area as a sampling site. If this is not an option, a waiver to operate beyond the required temperature range should be sought from the EPA Regional Office, if it can be shown that the site can meet established data quality requirements. In addition, when providing
cooling to shelters, care should be taken to avoid cool air blowing directly on monitors.
In order to detect and correct temperature fluctuations, it is suggested that a 24-hour temperature recorder that collects hourly values (minimally) be located in the shelter. The device should be accurate to within ±2°C and checked every 6 months against a NIST-traceable standard. These recorders can be connected to data loggers and should be considered official documentation that should be filed (see Section 5). Many vendors offer these types of devices. Usually they are thermocouple/thermistor devices of simple design and are generally very sturdy. Reasons for using electronic shelter temperature devices are two-fold: 1) through remote interrogation of the DAS, the agency can tell if values collected by air quality instruments are valid, and 2) the agency can verify that the shelter temperature is within a safe operating range if the air conditioning/heating system fails.
7.3 Sampling Probes And Manifolds
7.3.1 Design of Probes and Manifolds for Automated Methods
Some important variables affecting the sampling manifold design are the diameter, length, flow rate,
pressure drop, and materials of construction. With the development of NCore precursor gas monitoring,
various types of probe/manifold designs were reviewed. This information can be found in the Technical
Assistance Document (TAD) for Precursor Gas Measurements in the NCore Multi-pollutant Monitoring
Network [1] and is also included in Appendix F of this Handbook.

[1] http://www.epa.gov/ttn/amtic/files/ambient/monitorstrat/precursor/tadversion4.pdf
Of the probe and manifold materials evaluated over the years, only Pyrex (borosilicate) glass and FEP Teflon have been found to be acceptable for use as intake sampling lines for all the reactive gaseous pollutants, and the EPA has specified borosilicate glass or FEP Teflon for that purpose. For VOC monitoring at PAMS sites, however, FEP Teflon is unacceptable as the probe material because of VOC adsorption and desorption reactions on the FEP Teflon. Borosilicate glass, stainless steel, or its equivalent, are acceptable probe materials for VOC and carbonyl sampling. Care must be taken to ensure that the sample residence time is kept to 20 seconds or less (see below).
When determining how to set up a sampling station with regard to probes, inlets and sampling material, monitoring organizations have the option of:

1) using individual Teflon sampling lines (Figure 7.2), which may access the ambient air through one port (with a number of individual lines), with each line running directly to an analyzer; or

2) using glass manifolds (Figure 7.3), which allow ambient air to enter from a single inlet, collect in the manifold, and then be distributed through manifold outlet ports to individual analyzers.

Either method is appropriate; the choice may depend on the number of analyzers at the site, how the shelter is configured for access, and what resources are available for maintenance and cleaning.
Residence Time Determination
No matter how nonreactive the sampling probe material may be, after a period of use, reactive particulate
matter is deposited on the probe walls. Therefore, the time it takes the gas to transfer from the probe inlet
to the sampling device is critical. Ozone, in the presence of nitrogen oxide (NO), will show significant
losses even in the most inert probe material when the residence time exceeds 20 seconds. Other studies
indicate that a 10 second or less residence time is easily achievable.
Residence time is defined as the amount of time that it takes for a sample of air to travel from the opening
of the inlet probe (or cane) to the inlet of the instrument and is required to be less than 20 seconds for
reactive gas monitors. The residence time of pollutants within the sampling manifold is also critical. It is
recommended that the residence time within the manifold and sample lines to the instruments be less than
10 seconds (of the total allowable 20 seconds). If the volume of the manifold does not allow this to occur,
then a blower motor or other device (vacuum pump) can be used to decrease the residence time. The
residence time for a manifold system is determined in the following way. First, the volume of the cane, manifold and sample lines must be determined using the following equation:

Total Volume = Cv + Mv + Lv

where:
Cv = volume of the sample cane and extensions, cm³
Mv = volume of the sample manifold and trap, cm³
Lv = volume of the instrument lines, cm³
Each of the components of the sampling system must be measured individually. To measure the volume
of the components, use the following calculation:
V = pi × (d/2)² × L

where:
V = volume of the component, cm³
pi = 3.14159
L = length of the component, cm
d = inside diameter, cm
Once the total volume is determined, divide the volume by the combined flow rate of all instruments. This will give the residence time.
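Putting the two formulas together, a minimal sketch of the calculation might look like the following; all dimensions and flows are hypothetical illustrations, not recommended values.

```python
import math

def cylinder_volume_cm3(inside_diameter_cm: float, length_cm: float) -> float:
    """V = pi * (d/2)^2 * L for one cylindrical sampling component."""
    return math.pi * (inside_diameter_cm / 2) ** 2 * length_cm

# Hypothetical dimensions for illustration only
cane_v = cylinder_volume_cm3(1.3, 200)      # sample cane and extensions
manifold_v = cylinder_volume_cm3(7.5, 100)  # manifold and trap
lines_v = cylinder_volume_cm3(0.4, 300)     # instrument lines (total)
total_volume_cm3 = cane_v + manifold_v + lines_v

# Combined draw of the analyzers (plus blower, if used), cm^3/s
total_flow_lpm = 25.0
flow_cm3_per_s = total_flow_lpm * 1000 / 60

residence_time_s = total_volume_cm3 / flow_cm3_per_s
print(f"Residence time: {residence_time_s:.1f} s")  # ~11.3 s here
assert residence_time_s < 20, "Reactive-gas residence time must be < 20 s"
```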
It has been demonstrated that there are no significant losses of reactive gas (O3) concentrations in
conventional 13 mm inside diameter sampling lines of glass or Teflon if the sample residence time is 10
seconds or less. This is true even in sample lines up to 38 m in length, which collect substantial amounts
of visible contamination due to ambient aerosols. However, when the sample residence time exceeds 20
seconds, loss is detectable, and at 60 seconds the loss is nearly complete.
The air flow through the manifold must not be so great as to cause the pressure inside the manifold to
be more than one inch of water below ambient. These last two conditions are in opposition to each
other, but can be assessed as follows. Construct the manifold. Use a pitot tube to measure the flow of
the sample inside the manifold. At the same time, attach a water manometer to a sampling port. Turn
on the blower and measure the flow rate and the vacuum. (Remember to allow for the air demand of
the instrumentation). Adjust the flow rate to fit between these two parameters. If this is impossible,
the diameter of the manifold is too small.
Placement of tubing on the Manifold: If the manifold that
is employed at the station has multiple ports then placement
of the instrument lines can be crucial. If a manifold similar
to Figure 7.4 is used ambient air flows down the center tube
and then travels up on both sides of the manifold to the
analyzer ports. It is suggested that instruments requiring
lower flows be placed towards the bottom of the manifold.
The general rule of thumb states that the calibration line (if
used) placement should be in a location so that the
calibration gases flow past the instruments before the gas is
evacuated out of the manifold. Figure 7.4 illustrates two
potential introduction ports for the calibration gas. The port
at the elbow of the sampling cane provides more
information about the cleanliness of the sampling system.
7.3.2 Placement of Probes and Manifolds
Probes and manifolds must be placed to avoid introducing
bias to the sample. Important considerations are probe
height above the ground, probe length (for horizontal
probes), and physical influences near the probe.
Figure 7.4 Positions of calibration line in sampling manifold (schematic showing the pump, calibrator gas inlet, excess calibration gas vent, and analyzer ports)
Some general guidelines for probe and manifold placement are:
probes should not be placed next to air outlets such as exhaust fan openings
horizontal probes must extend beyond building overhangs
probes should not be near physical obstructions such as chimneys which can affect the air flow in
the vicinity of the probe
height of the probe above the ground depends on the pollutant being measured
Table 7-3 summarizes the probe and monitoring path siting criteria while Table 7-4 summarizes the
spacing of probes from roadways. This information can be found in 40 CFR Part 58, Appendix E [2]. For PM10 and PM2.5, Figure 7.5 provides the acceptable areas for micro, middle, neighborhood and urban samplers, with the exception of microscale street canyon sites.
Table 7-3 Summary of Probe and Monitoring Path Siting Criteria

| Pollutant | Scale (maximum monitoring path length) | Height from ground to probe, inlet or 80% of monitoring path (1), meters | Horizontal and vertical distance from supporting structures (2) to probe, inlet or 90% of monitoring path (1), meters | Distance from trees to probe, inlet or 90% of monitoring path (1), meters | Distance from roadways to probe, inlet or monitoring path (1), meters |
| SO2 (3,4,5,6) | Middle (300 m); Neighborhood, Urban, and Regional (1 km) | 2-15 | >1 | >10 | N/A |
| CO (4,5,7) | Micro; Middle (300 m); Neighborhood (1 km) | 3 +/- 1/2 (micro); 2-15 (middle and neighborhood) | >1 | >10 | 2-10 (micro); see Table 7-4 of this section for middle and neighborhood scales |
| NO2, O3 (3,4,5) | Middle (300 m); Neighborhood, Urban, and Regional (1 km) | 2-15 | >1 | >10 | See Table 7-4 of this section for all scales |
| Ozone precursors (for PAMS) (3,4,5) | Neighborhood and Urban (1 km) | 2-15 | >1 | >10 | |
| PM, Pb (3,4,5,6,8) | Micro; Middle; Neighborhood; Urban; Regional | 2-7 (micro); 2-7 (middle PM10-2.5); 2-15 (all other scales) | >2 (all scales, horizontal distance only) | >10 (all scales) | 2-10 (micro); see Figure 7.5 of this section for all other scales |

N/A - Not applicable.
(1) Monitoring path for open path analyzers is applicable only to middle or neighborhood scale CO monitoring and all applicable scales for monitoring SO2, O3, O3 precursors, and NO2.
(2) When the probe is located on a rooftop, this separation distance is in reference to walls, parapets, or penthouses located on the roof.
(3) Should be >20 meters from the dripline of tree(s) and must be >10 meters from the dripline when the tree(s) act as an obstruction.
(4) Distance from sampler, probe, or 90% of monitoring path to an obstacle, such as a building, must be at least twice the height that the obstacle protrudes above the sampler, probe, or monitoring path. Sites not meeting this criterion may be classified as middle scale (see text).
(5) Must have unrestricted airflow 270 degrees around the probe or sampler; 180 degrees if the probe is on the side of a building.
(6) The probe, sampler, or monitoring path should be away from minor sources, such as furnace or incineration flues. The separation distance is dependent on the height of the minor source's emission point (such as a flue), the type of fuel or waste burned, and the quality of the fuel (sulfur, ash, or lead content). This criterion is designed to avoid undue influences from minor sources.
(7) For microscale CO monitoring sites, the probe must be >10 meters from a street intersection and preferably at a midblock location.
(8) Collocated monitors must be within 4 meters of each other and at least 2 meters apart for flow rates >200 liters/min and at least 1 meter apart for flow rates <200 liters/min.
[2] http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title40/40tab_02.tpl (All references to CFR in the following sections can be found at this site.)
Table 7-4 Minimum Separation Distance Between Roadways and Sampling Probes or Monitoring Paths at Neighborhood and Urban Scales for O3, Oxides of Nitrogen (NO, NO2, NOx, NOy) and CO

| Roadway average daily traffic, vehicles per day | O3 and oxides of N, neighborhood & urban (1), meters | O3 and oxides of N, neighborhood & urban (1)(2), meters | CO, neighborhood, meters |
| <1,000 | 10 | 10 | |
| 10,000 | 10 | 20 | |
| <10,000 | | | 10 |
| 15,000 | 20 | 30 | 25 |
| 20,000 | 30 | 40 | 45 |
| 30,000 | | | 80 |
| 40,000 | 50 | 60 | 115 |
| 50,000 | | | 135 |
| >60,000 | | | 150 |
| 70,000 | 100 | 100 | |
| >110,000 | 250 | 250 | |

(1) Distance from the edge of the nearest traffic lane. The distance for intermediate traffic counts should be interpolated from the table values based on the actual traffic count.
(2) Applicable for ozone monitors whose placement has not already been approved as of December 18, 2006.
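Footnote 1 calls for interpolating between table values for intermediate traffic counts. The following minimal sketch applies that interpolation to the first O3/oxides-of-N column; the values come from the table above, but the helper function itself is a hypothetical convenience.

```python
# Linear interpolation of the minimum probe-roadway separation for an
# intermediate average daily traffic (ADT) count, per footnote 1 above.
# Values are the first O3/oxides-of-N column of Table 7-4.
ADT =        [1_000, 10_000, 15_000, 20_000, 40_000, 70_000, 110_000]
MIN_DIST_M = [   10,     10,     20,     30,     50,    100,     250]

def min_separation_m(adt: float) -> float:
    if adt <= ADT[0]:
        return MIN_DIST_M[0]
    if adt >= ADT[-1]:
        return MIN_DIST_M[-1]
    for i in range(1, len(ADT)):
        if adt <= ADT[i]:
            frac = (adt - ADT[i - 1]) / (ADT[i] - ADT[i - 1])
            return MIN_DIST_M[i - 1] + frac * (MIN_DIST_M[i] - MIN_DIST_M[i - 1])

print(min_separation_m(25_000))  # 25,000 vehicles/day -> 35.0 m
```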
Figure 7.5 Acceptable areas for PM10 and PM2.5 micro, middle, neighborhood, and urban samplers except for microscale street canyon sites. (The figure plots the ADT of affecting roads, x 10³, against the distance of PM10 and PM2.5 samplers from the nearest traffic lane, in meters, and delineates: an area with no category (a) sites, unacceptable at all traffic levels; a preferred area for category (a) sites, microscale if the monitor is 2-7 meters high, middle scale otherwise; a middle scale area suitable for a category (a) site but not preferred; a neighborhood scale area suitable for category (b) sites; and an urban scale area.)
Open Path Monitoring
To ensure that open path monitoring data are representative of the intended monitoring objective(s),
specific path siting criteria are needed. 40 CFR Part 58, Appendix E, contains specific location criteria
applicable to monitoring paths after the general station siting has been selected based on the monitoring
objectives, spatial scales of representativeness, and other considerations presented in Appendix D. The
open path siting requirements largely parallel the existing requirements for point analyzers, with the
revised provisions applicable to either a "probe" (for point analyzers), a "monitoring path" (for open path
analyzers), or both, as appropriate. Criteria for the monitoring path of an open path analyzer are given
for horizontal and vertical placement, spacing from minor sources, spacing from obstructions, spacing
from trees, and spacing from roadways. These criteria are summarized in Table 7-3.
Cumulative Interferences on a Monitoring Path: To control the sum effect on a path measurement
from all the possible interferences which exist around the path, the cumulative length or portion of a
monitoring path that is affected by obstructions, trees, or roadways must not exceed 10 percent of the total
monitoring path length. This limit for cumulative interferences on the monitoring path controls the total
amount of interference from minor sources, obstructions, roadways, and other factors that might unduly
influence the open path monitoring data.
Monitoring Path Length: For NO2, O3 and SO2, the
monitoring path length must not exceed 1 kilometer
for analyzers in neighborhood, urban, or regional
scales, or 300 meters for middle scale monitoring sites.
These path limitations are necessary in order to
produce a path concentration representative of the
measurement scale and to limit the averaging of peak
concentration values. In addition, the selected path
length should be long enough to encompass plume
meander and expected plume width during periods
when high concentrations are expected. In areas
subject to frequent periods of rain, snow, fog, or dust,
a shortened monitoring path length should be
considered to minimize the loss of monitoring data due
to these temporary optical obstructions.
Mounting of Components and Optical Path
Alignment: Since movements or instability can
misalign the optical path, causing a loss of light and
less accurate measurements or poor readings, highly
stable optical platforms are critical. Steel buildings
and wooden platforms should be avoided as they tend
to move more than brick buildings when wind and
temperature conditions vary. Metal roofing will, for
example, expand when heated by the sun in the
summer. A concrete pillar with a wide base, placed upon a stable base material, has been found to work
well in field studies. A sketch of an optical platform is included in Figure 7.6. More information on open path monitoring can be found in the document EPA Handbook: Optical Remote Sensing for Measurement and Monitoring of Emissions Flux [3].

[3] http://www.epa.gov/ttn/emc/guidlnd/gd-052.pdf
7.3.3 Probe, Tubing and Manifold Maintenance
Figure 7.7 Examples of contaminated tubing and manifolds needing more frequent maintenance
After an adequately designed sampling probe and/or manifold has been selected and installed, the
following steps will help in maintaining constant sampling conditions:
1. Conduct a leak test. For the conventional manifold, seal all ports and pump down to
approximately 1.25 cm water gauge vacuum, as indicated by a vacuum gauge or manometer
connected to one port. Isolate the system. The vacuum measurement should show no change at
the end of a 15-min period.
2. Establish cleaning techniques and a schedule. A large diameter manifold may be cleaned by
pulling a cloth on a string through it. Otherwise the manifold must be disassembled periodically
and cleaned with distilled water. Soap, alcohol, or other products that may contain hydrocarbons
should be avoided when cleaning the sampling train. These products may leave a residue that
may affect volatile organic measurements. Visible dirt should not be allowed to accumulate.
3. Plug the ports on the manifold when sampling lines are detached.
4. Maintain a flow rate in the manifold that is either 3 to 5 times the total sampling requirement or equal to the total sampling requirement plus 140 L/min; a sketch of this sizing (and of the leak test in item 1) follows this list. Either rate will help to reduce the sample residence time in the manifold and ensure adequate gas flow to the monitoring instruments.
5. Maintain the vacuum in the manifold <0.64 cm water gauge. Keeping the vacuum low will help
to prevent the development of leaks.
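The following minimal sketch expresses the leak test (item 1) and the flow-rate sizing (item 4) from the list above; the function names and the leak-test tolerance are illustrative assumptions.

```python
# Sketch of two checks from the maintenance list above (items 1 and 4).
def leak_test_passes(vacuum_start_cm_h2o: float,
                     vacuum_end_cm_h2o: float,
                     tolerance_cm_h2o: float = 0.01) -> bool:
    """Sealed manifold pumped to ~1.25 cm water gauge must hold for 15 min."""
    return abs(vacuum_end_cm_h2o - vacuum_start_cm_h2o) <= tolerance_cm_h2o

def manifold_flow_options_lpm(total_instrument_demand_lpm: float):
    """Item 4: either 3-5x the total demand, or the demand plus 140 L/min."""
    return ((3 * total_instrument_demand_lpm, 5 * total_instrument_demand_lpm),
            total_instrument_demand_lpm + 140)

print(leak_test_passes(1.25, 1.25))     # True: vacuum held over the period
print(manifold_flow_options_lpm(20.0))  # ((60.0, 100.0), 160.0) L/min
```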
For monitoring organizations that use individual sampling lines instead of manifolds, one may want to
weigh the cost of cleaning lines versus replacing them.
In addition to the information presented above, the following should be considered when designing a
sampling manifold:
suspending strips of paper in front of the blower's exhaust to permit a visual check of blower
operation;
positioning air conditioner vents away from the manifold to reduce condensation of water vapor
in the manifold;
positioning air conditioner vents away from analyzers;
positioning sample ports of the manifold toward the ceiling to reduce the potential for
accumulation of moisture in analyzer sampling lines, and using borosilicate glass, stainless steel,
or their equivalent for VOC sampling manifolds at PAMS sites to avoid adsorption and
desorption reactions of VOCs on FEP Teflon;
if moisture in the sample train poses a problem (moisture can absorb gases, namely NOx and SO2), wrap the manifold and instrument lines with heat wrap, a product that has heating coils within a cloth covering; this allows the manifold to be maintained at a constant temperature that does not increase the sampled air temperature by more than 3-5°C above ambient temperature;
ensuring the manifold has a moisture trap and that it is emptied often (water traps in sample lines
from the manifold to the instruments should be avoided); and
using water resistant particulate filters in-line with the instrument.
7.4 Reference/Equivalent Methods and Approved Regional Methods
For monitoring in a SLAMS network, either reference or equivalent methods are usually required. This
requirement, and any exceptions, are specified in 40 CFR Part 58, Appendix C. In addition, reference or
equivalent methods may be required for other monitoring applications, such as those associated with
prevention of significant deterioration (PSD). Requiring the use of reference or equivalent methods helps to assure the reliability of air quality measurements and offers benefits including ease of specification, guaranteed minimum performance, better instruction manuals, flexibility of application, comparability with other data, and increased credibility of measurements. However, designation as a reference or equivalent
method provides no guarantee that a particular analyzer will always operate properly. 40 CFR Part 58,
Appendix A requires the monitoring organization to establish an internal QC program. Specific guidance
for a minimum QC program is described in Section 10 of this Handbook. The definitions and
specifications of reference and equivalent methods are given in 40 CFR Part 53. For most monitoring
applications, the distinction between reference and equivalent methods is unimportant and either may be
used interchangeably.
Reference and equivalent methods may be either manual or automated (analyzers). For particulates and
Pb, the reference method for each is a unique manual method that is completely specified in 40 CFR Part
50; all other approved methods for particulates and Pb qualify as equivalent methods. SO2 has a reference method and a measurement principle. For CO, NO2, and O3, Part 50 provides only a measurement principle and calibration procedure applicable to reference methods for these pollutants. Automated methods (analyzers) for these pollutants may be designated as either reference methods or equivalent methods, depending on whether the methods utilize the same measurement principle and calibration procedure specified in Part 50. Because any analyzer that meets the requirements of the specified measurement principle and calibration procedure may be designated as a reference method, there are numerous reference methods for SO2, CO, NO2, and O3. Further information on this subject is in the preamble to 40 CFR Part 53.
Except for the unique reference methods for SO2, particulates, and Pb specified in 40 CFR Part 50, all
reference and equivalent methods must be officially designated as such by EPA under the provisions of
40 CFR Part 53. Notice of each designated method is published in the Federal Register at the time of
designation. A current list of all designated reference and equivalent methods is maintained and updated
by EPA whenever a new method is designated. This list can be found on AMTIC [4]. Moreover, any analyzer offered for sale as a reference or equivalent method after April 16, 1976 must bear a label or sticker indicating that the analyzer has been designated as a reference or equivalent method by EPA. Sellers of designated automated methods must comply with the conditions as promulgated in 40 CFR Part 53.9. Monitoring organizations should be aware of the vendor conditions.

[4] http://www.epa.gov/ttn/amtic/criteria.html

Accordingly, in selecting a
designated method for a particular monitoring application, consideration should be given to such aspects
as:
the suitability of the measurement principle;
the suitability for the weather and/or geographic conditions at the site;
analyzer sensitivity and available operating ranges suitable for the site;
susceptibility to interferences that may be present at the monitoring site;
requirements for support gases or other equipment;
reliability;
maintenance requirements;
initial as well as operating costs;
features such as internal or fully automatic zero and span checking or adjustment capability, etc.;
compatibility with your current and future network, e.g., software and connections (RS-232,
Ethernet); and
manual or automated methods.
The order for a new reference or equivalent analyzer should specify the EPA method designation.
The required performance specifications, terms of the warranty, time limits for delivery and acceptance
testing, and what happens in the event that the analyzer falls short of performance requirements should be
documented. Aside from occasional malfunctions, consistent or repeated noncompliance with any of
these conditions should be reported to EPA. In selecting designated methods, remember that designation
of a method indicates only that it meets certain minimum standards. Competitive differences still exist
among designated analyzers. Some analyzers or methods may have performance, operational, economic
or other advantages over others. A careful selection process based on the individual air monitoring
application and circumstances is very important.
Some of the performance tests and other criteria used to qualify a method for designation as a reference or
equivalent method are intended only as pass/fail tests to determine compliance with the minimum
standards. Test data may not allow quantitative comparison of one method with another.
FRM/FEM Designated Operating Ranges and the Effect of Span Checks
Although all FRM/FEMs are required to meet the range specified in Table 7-5,5 many instruments are
designated for ranges narrower and/or broader than the requirement. During the equipment
purchase/selection phase, monitoring organizations should select an instrument with ranges most
appropriate to the concentrations at the site at which the instrument will be operated and then use the range
that is most appropriate for the monitoring situation. Earlier versions of this Handbook suggested that the
concentration of the span checks be 70-90% of the analyzer's measurement range. Using this guidance
and the designated ranges of some of the FRM/FEM methods being used, a span check might be selected
4 http://www.epa.gov/ttn/amtic/criteria.html
5 Performance specifications can be found in 40 CFR Part 53.23, Table B-1.
at a concentration that is never found in the ambient air at the site at which the monitor is operating. A
span check concentration should be selected that is more beneficial to the quality control of the routine
data at the site; EPA suggests: 1) selecting an appropriate measurement range, and 2) selecting a
span that, at a minimum, is above 120% of the highest NAAQS (for sites used for designation purposes)
and above the 99th percentile of the routine data over a 3-year period. The multi-point verification/calibrations that
are performed annually can be used to challenge the instrument and confirm linearity and calibration
slope of the selected operating range.
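As a concrete illustration of suggestion 2), the following minimal Python sketch (not from the Handbook; the NAAQS level and routine data are hypothetical) computes a candidate span concentration as the larger of 120% of the NAAQS and the 99th percentile of three years of routine data:

    import numpy as np

    def suggested_span(naaqs_level, routine_concentrations):
        """Return a candidate span concentration (same units as inputs).

        Applies the two suggestions above: at least 120% of the highest
        NAAQS, and above the 99th percentile of ~3 years of routine data.
        """
        floor_from_naaqs = 1.20 * naaqs_level
        p99_routine = np.percentile(routine_concentrations, 99)
        return max(floor_from_naaqs, p99_routine)

    # Hypothetical example: an O3 NAAQS of 0.075 ppm and synthetic hourly data.
    rng = np.random.default_rng(0)
    hourly_o3 = rng.gamma(shape=2.0, scale=0.015, size=3 * 365 * 24)  # ppm
    print(round(suggested_span(0.075, hourly_o3), 4))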
Table 7-5 Performance Specifications for Automated Methods (table not reproduced; performance specifications can be found in 40 CFR Part 53.23, Table B-1)
PM2.5 Reference and Equivalent Methods
All formal sampler design and performance requirements and the operational requirements applicable to
reference methods for PM2.5 are specified in 40 CFR Part 50, Appendix L. These requirements are quite
specific and include explicit design specifications for the type of sampler, the type of filter, the sample
flow rate, and the construction of the sample collecting components. However, various designs for the
flow-rate control system, the filter holder, the operator interface controls, and the exterior housing are
possible. Hence, various reference method samplers from different manufacturers may vary considerably
in appearance and operation. Also, a reference method may have a single filter capability (single) or a
multiple filter capability (sequential), provided no deviations are necessary in the design and construction
of the sample collection components specified in the reference method regulation. A PM2.5 method is not
a reference method until it has been demonstrated to meet all the reference method regulatory
requirements and has been officially designated by EPA as a reference method for PM2.5.
Equivalent methods for PM2.5 have a wider latitude in their design, configuration, and operating principle
than reference methods. These methods are not required to be based on filter collection of PM2.5;
therefore, continuous or semi-continuous analyzers and new types of PM2.5 measurement technologies are
not precluded as possible equivalent methods. Equivalent methods are not necessarily required to meet all
the requirements specified for reference methods, but they must demonstrate both comparability to
reference method measurements and similar PM2.5 measurement precision.
The requirements that some (but not all) candidate methods must meet to be designated by EPA as
equivalent methods are specified in 40 CFR Part 53. To minimize the difficulty of meeting equivalent
method designation requirements, three classes of equivalent methods have been established in the 40
CFR Part 53 regulations, based on a candidate method's extent of deviation from the reference method
requirements. All three classes of equivalent methods are acceptable for SLAMS or SLAMS-related
PM2.5 monitoring, but not all types of equivalent methods may be equally suited to various PM2.5
monitoring requirements or applications.
Class I equivalent methods are very similar to reference methods, with only minor deviations, and must
meet nearly all of the reference method specifications and requirements. The requirements for designation
as Class I equivalent methods are only slightly more extensive than the designation requirements for
reference methods. Also, because of their substantial similarity to reference methods, Class I equivalent
methods operate very much the same as reference methods.
Class II equivalent methods are filter-collection-based methods that differ more substantially from the
reference method requirements. The requirements for designation as Class II methods may be
considerably more extensive than for reference or Class I equivalent methods, depending on the specific
nature of the variance from the reference method requirements.
Class III equivalent methods cover any PM2.5 methods that cannot qualify as reference or Class I or II
equivalent methods because of more profound differences from the reference method requirements. This
class encompasses PM2.5 methods such as continuous or semi-continuous PM2.5 analyzers and potential
new PM2.5 measurement technologies. The requirements for designation as Class III methods are the most
extensive, and, because of the wide variety of PM2.5 measurement principles that could be employed for
candidate Class III equivalent methods, the designation requirements are not explicitly provided in 40
CFR Part 53.
Approved Regional Methods (ARM)
There are some continuous PM2.5 methods that currently may not be able to meet the national FRM and
FEM designation criteria. However, these methods may operate at acceptable levels of data quality in
certain regions of the country or under certain conditions. The EPA has expanded the use of alternative
PM2.5 measurement methods through ARMs. A method for PM2.5 that has not been designated as an FRM
or FEM as defined in 40 CFR 50.1 may be approved as an ARM. If a monitoring organization feels
that a particular method may be suitable for use in its network, it can apply for the method to be
designated as an ARM. The following provides a summary of the ARM requirements.
PM2.5 ARM Criteria Summary
1. Must meet Class III Equivalency Criteria
o Precision
o Correlation
o Additive and multiplicative bias
2. Tested at site(s) where it will be used
o 1 site in each MSA/CMSA, up to the 2 highest-population MSAs/CMSAs
o 1 site in rural area or Micropolitan Statistical Area
o Total of 3
If the ARM has been approved by another agency then:
o 1 site in MSA/CMSA and 1 site in rural area or Micropolitan Statistical Area
o Total of 2
3. 1 year of testing, with all seasons covered (a completeness-check sketch follows this list)
o 90 valid sample pairs per site with at least 20 valid sample pairs per season.
o Values < 3 µg/m3 may be excluded from bias estimates, but this does not affect completeness criteria.
4. Collocation to establish precision not required
o peer reviewed published literature or data in AQS that can be presented is enough
5. ARM must be operated on an hourly sampling frequency providing for aggregation into 24-hour average
measurements.
6. Must use approved inlet and separation devices (Part 50 Appendix L or FEM Part 53)
o Exception: methods that by their inherent measurement principle may not need an inlet or
separation device.
7. Must be capable of providing for flow audits
o Exception: methods for which, by their inherent measurement principle, measured flow is not required.
8. Monitoring agency must develop and implement appropriate procedures for assessing and reporting
precision and bias.
Routine Monitoring Implementation
9. Collocation of ARM and FRM/FEM at 30% of SLAMS network or at least 1/network
o At 1 in 6 day sampling frequency
o Located at design value site among the largest MSA/CMSA
o Collocated FRM/FEM can be substituted for ARM if ARM is invalidated
10. Collocation ARM with ARM
o 7.5% of sites or at least 1 site
11. Bias assessment (PEP)
o Same frequency as Appendix A
ARM Approval
1. New ARM: EPA NERL, RTP, NC
2. ARM that has been approved by another agency: EPA Regional Administrator
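The following minimal Python sketch (illustrative only; the season labels and data layout are hypothetical, and the reading of the exclusion rule is one interpretation) applies the completeness rules in item 3 above:

    from collections import defaultdict

    def check_arm_completeness(sample_pairs):
        """Check ARM test-data completeness for one site.

        sample_pairs: list of (season, arm_value, frm_value) tuples, where
        season is one of 'winter', 'spring', 'summer', 'fall'. Values below
        3 ug/m3 may be excluded from bias estimates, but they still count
        toward completeness.
        """
        per_season = defaultdict(int)
        for season, arm, frm in sample_pairs:
            per_season[season] += 1
        total_ok = len(sample_pairs) >= 90
        season_ok = all(per_season[s] >= 20
                        for s in ("winter", "spring", "summer", "fall"))
        # Pairs usable for bias estimates; exclusion does not affect completeness.
        bias_pairs = [(a, f) for _, a, f in sample_pairs if a >= 3.0 and f >= 3.0]
        return total_ok and season_ok, bias_pairs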
8.0 Sample Handling and Custody
A critical activity within any data collection phase involving physical samples is the handling of sample
media prior to sampling, handling/transporting sample media to the field, handling samples in the field at
the time of collection, storage of samples (in the field or other locations), transport of samples from the
field site, and the analysis of the samples. Documentation ensuring that proper handling has occurred
throughout these activities is part of the custody record. This documentation initially comes in the form
of written sample handling and custody procedures and then in the development, use, and archiving of field
and laboratory notebooks and chain of custody forms.
Custody records document the chain of custody: the date and person responsible for the various sample
handling steps associated with each sample, and the information confirming that sample integrity
remained intact. Custody records also provide a reviewable trail for quality assurance purposes and can
be used as evidence in legal proceedings.
Prior to the start of an EDO, the various types of samples should be identified and the following questions
asked:
Does the sample need to be analyzed within a specified time period?
What modes of sample transport are necessary and how secure should they be?
What happens if a sample is collected on Friday? Is the sample shipped for weekend or weekday
delivery, or stored at the field office, and what are the appropriate custody procedures?
Can the sample's integrity be affected by outside influences (e.g., temperature, pressure, humidity,
jostling/dropping during shipment) and do these need to be monitored (e.g., max/min
thermometers, pressure sensors)?
How critical is it that sample integrity be known (e.g., is evidence tape necessary)?
How can it be documented that sample integrity was maintained from collection to reporting?
What are the procedures when sample integrity is compromised (e.g., flag, don't analyze)?
These are some of the questions that should be answered and documented in the monitoring
organization's QAPP and chain of custody procedures.
This section specifically addresses the handling and custody of physical environmental samples (e.g.,
exposed filters for particulate matter or lead (PM or Pb) determinations and canisters containing whole air
samples) that are collected at a field location and transported to a laboratory for analysis. For specific
details of sample handling and custody (e.g., PAMS, NATTS, CSN), monitoring organizations should
consult the appropriate technical assistance documents located in the National Programs summaries in
Appendix A.
In addition to physical samples, some types of field data collected in hard copy (e.g., strip charts, sampler
flow data, etc.) or electronic (e.g., data downloaded from a data logger with limited storage space) format
are irreplaceable and represent primary information about physical samples or on-site measurements that
are needed to report a final result. When such hard copy or electronic data are transported and/or change
custody, it is advised that the same chain of custody practices described in this section for physical
samples be employed to ensure that irreplaceable data can be tracked and are not altered or tampered
with.
For additional information, an EPA on-line self-instructional course, Chain-of-Custody Procedures for
Samples and Data,1 is available for review. The National Enforcement Investigations Center2 (NEIC) also
offers a course relevant to chain of custody issues.
Laboratory Information Management Systems
A laboratory information management system (LIMS) is a computer system used in the laboratory for the
management and tracking of samples, instruments, standards and other laboratory functions such as data
reductions, data transfer and reporting. The goal is to create an EDO where:
Instruments used are integrated in the lab network; receive instructions and worklists from the
LIMS and return finished results including raw data back to a central repository where the LIMS
can update relevant information to external systems (e.g., AIRNow or AQS).
Lab personnel will review/check calculations, documentation and results using online information
from connected instruments, reference databases and other resources using electronic lab
notebooks connected to the LIMS.
Management can supervise the lab process, react to bottlenecks in workflow and ensure
regulatory demands are met.
External participants can review results and print out analysis certificates and other
documentation (QA Reports, quality control charts, outlier reports etc.).
For monitoring programs that are fairly stable, such as criteria pollutant monitoring, development of a
LIMS may be very cost-effective and should be considered. There is an upfront cost in the
development of these systems, but monitoring organizations that have devoted resources to their
development have seen payoffs in improved data quality, sample tracking and data reporting.
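As a sketch of the kind of sample tracking a LIMS provides, here is a minimal, hypothetical Python record; the names and fields are illustrative, not a specification for any actual LIMS:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class LimsSample:
        """Hypothetical minimal LIMS record tracking a sample through the lab."""
        sample_id: str
        history: list = field(default_factory=list)  # (timestamp, station, analyst)

        def log_step(self, station, analyst):
            # Each handling step is time-stamped so the record is reviewable later.
            self.history.append((datetime.now(), station, analyst))

    s = LimsSample("PM25-2013-0142")
    s.log_step("receiving", "jdoe")
    s.log_step("equilibration", "jdoe")
    s.log_step("post-weighing", "asmith")
    for when, station, analyst in s.history:
        print(when.isoformat(timespec="seconds"), station, analyst)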
8.1 Sample Handling
In the Ambient Air Quality Monitoring Program, discrete samples from manual methods associated with
SLAMS, PAMS, NATTS, and other networks, are physically handled prior to analysis. One must pay
particular attention to the handling of filters for particulate matter and lead, since it has been suggested
that the process of filter handling may be the largest source of measurement error (especially for low-volume
methods). Due to the manner in which concentrations are determined, it is critical that samples are
handled as specified in SOPs. The various phases of sample handling that should be documented in a
QAPP and SOP include:
Sample preparation, labeling and identification;
sample collection;
transportation;
sample analysis; and
storage (at all stages of use) and archival
1 http://www.epa.gov/apti/coc/
2 http://www.epa.gov/oecaerth/training/neti/index.html
8.1.1 Sample Preparation, Labeling and Identification
Sample containers should be cleaned and filters prepared (pre-weighing of filters) before being used to
collect samples. SOPs should indicate the proper care and handling of the containers/filters to ensure
their integrity. Proper lab documentation that tracks the disposition of containers/filters through
preparation is just as important as the documentation after sampling. Care must be taken to properly mark
all samples to ensure positive, unambiguous identification throughout the sample collection, handling,
and analysis procedures. Figure 8.1 shows a standardized identification sticker that may be used to label
physical samples. Additional information may be added as required, depending on the particular
monitoring program. The rules of evidence used in legal proceedings require that procedures for
identification of samples used in analyses form the basis for future evidence. An admission by the
laboratory analyst that he/she cannot be positive whether he/she analyzed sample No. 6 or sample No. 9,
for example, could destroy the validity of the entire test report. Any information that can be used to
assess sample integrity, such as the pressure of canisters or cooler temperature, should be recorded at the
time of sample collection. Canister pressure or cooler temperature can then be ascertained at another
stage in the analytical process to confirm sample integrity.
Positive identification also must be provided for any filters used in the program. If ink is used for
marking, it must be indelible and unaffected by the gases and temperatures to which it will be subjected.
Other methods of identification can be used (e.g., bar coding), if they provide a positive means of
identification and do not impair the capacity of the filter to function.
(Name of Sampling Organization)
Sample ID No: _________________________ Storage Conditions: _________________________
Sample Type:___________________________ Site Name:_________________________________
Date/Time Collected: _____________________ Site Address:_______________________________
Sampler:_______________________________
Figure 8.1 Example Sample Label.
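As an illustration only, here is a minimal Python sketch of building an unambiguous sample ID of the kind Figure 8.1 calls for; the ID format is invented for this example, not prescribed:

    from datetime import datetime

    def make_sample_id(site_code, sample_type, collected, sequence):
        """Build an unambiguous sample ID; the format here is illustrative only."""
        # e.g. "060370002-PM25-20130115-003"
        return f"{site_code}-{sample_type}-{collected:%Y%m%d}-{sequence:03d}"

    print(make_sample_id("060370002", "PM25", datetime(2013, 1, 15), 3))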
8.1.2 Sample Collection
The sample collection phase includes transporting the sampling material (e.g., sample filters, canisters) to
the sampling site, setting up the samplers to run, and then collecting the samples for transport to the
laboratory. This section does not cover proper installation of sampling media in a sample/monitor which
is very important but is specific to individual sampler types and should be covered in detail in SOPs.
Custody procedures may start prior to sampling if there are specific timeframes within which the sampling media
must be used (e.g., the 30-day use window for PM2.5 filters). Therefore, custody forms may start from the
laboratory that prepared the sample media, and care must be taken to review and ensure the sample media
are viable for use.
Sometimes the specific sample media (e.g., a specific filter ID) have been assigned to a particular sampler at
the office rather than at the sampling site. If the site operator is setting up a number of samplers at one
site or at a number of sites, it is very important that the sample media and the chain of custody data are carefully
checked to ensure: 1) the chain of custody matches the sample media ID, and 2) the sample media are used
at the correct site and in the correct sampler.
To reduce the possibility of invalidating the results, all collected samples must be carefully removed from
the monitoring device, placed in labeled, nonreactive containers, and sealed. Use of tamper-evident
custody seals is suggested and may be required in certain cases. The sample label must adhere firmly to
the container to ensure that it cannot be accidentally removed. Custody seals on sample containers serve
two purposes: to prevent accidental opening of the sample container and to provide visual evidence
should the container be opened or tampered with. The best type of custody seal depends on the sample
container; often, a piece of tape placed across the seal and signed by the operating technician is sufficient;
for other containers, wire locks or tie wraps may be the best choice. In some cases, the opening of sample
containers by unauthorized personnel, such as Transportation Security Administration officers, cannot be
avoided. The proper use of custody seals minimizes the loss of samples and provides direct evidence
whether sample containers have been opened and possibly compromised. Samples whose integrity is
questioned should be qualified (flagged).
8.1.3 Sample Transportation
Samples should be delivered to the laboratory for analysis as soon as possible following sample
collection. It is recommended that this be done on the same day that the sample is taken from the
monitor. If this is impractical, all the samples should be placed in transport containers (e.g., carrying
case, cooler, shipping box, etc.) for protection from breakage, contamination, and loss and in an
appropriate controlled-temperature device (i.e., refrigerator or freezer) if the samples have specific
temperature requirements. Each transport container should have a unique identification, such as sampling
location, date, and transport container number (e.g., number 2 of 5) to avoid interchange and aid in
tracking the complete shipment. The number of the transport containers should be subsequently recorded
on the chain of custody (COC) form (described in Section 8.2) along with the sample identification
numbers of the samples included within each transport container. It is advised that the container be sealed
using an appropriate tamper-evident method, such as with custody tape or a wire lock.
In transporting samples, it is important that precautions be taken to eliminate the possibility of tampering,
accidental destruction, and/or physical and chemical action on the sample. The integrity of samples can
be affected by temperature extremes, air pressure (air transportation), and the physical handling of
samples (packing, jostling, etc.). These practical considerations must be dealt with on a site-by-site basis
and should be documented in the organization's QAPP and site-specific SOPs.
The person who has custody of the samples must be able to testify that no tampering occurred. Security
must be continuous. If the samples are put in a vehicle, lock the vehicle. After delivery to the laboratory,
the samples must be kept in a secured place with restricted access.
8.1.4 Sample Analysis
SOPs, if properly developed, have detailed information on the handling of samples at the analysis phase.
Similar to the preparation step, if the sample undergoes a number of steps (preparation, equilibration,
extraction, dilution, analysis, etc.), and these steps are performed by different individuals, there should be
a mechanism in place to track the sample through the steps to ensure SOPs are followed and the integrity
of the sample was maintained. Laboratories should make extensive use of laboratory notebooks at the
various steps (stations) of the analytical process to record the sample handling process and maintain
sample integrity.
8.1.5 Storage and Archival
Samples must be properly handled to ensure that there is no contamination and that the sample analyzed
is actually the sample taken under the conditions reported. For this reason, whenever samples are not
under the direct control of the sample custodian, they should be kept in a secured location. This may be a
locked vehicle, locked refrigerator, or locked laboratory with limited access. It is highly recommended
that all samples be secured until discarded. These security measures should be documented by a written
record signed by the handlers of the sample on the COC form or in a laboratory notebook, indicating the
storage location and conditions. Any samples not destroyed during the analysis process (e.g., exposed
filters for PM) should be archived as directed by the method requirements or applicable QAPP. 40 CFR
Part 58.16 requires that PM10, PM10-2.5 and PM2.5 filters from SLAMS manual low-volume samplers (samplers
having flow rates less than 200 liters/minute) be archived for 5 years from collection. However, it is
suggested that they be archived the first year in cold conditions (e.g., at 4 °C) and at room temperature for
2 additional years. It is also suggested that non-destructive lead analysis and CSN samples follow this
guidance.
8.2 Chain of Custody (COC)
In order to use the results of a sampling program as evidence, a written record must be available listing
the location of the samples at all times. This is also an important component of good laboratory
practices.3 The COC record is necessary to legally demonstrate that the integrity of samples has been
maintained. Without it, one cannot be sure that the samples and sampling data analyzed were the same as
the samples and data reported to have been taken at a particular time. Procedures may vary, but an actual
COC record sheet with the names and signatures of the relinquishers/receivers works well for tracking
physical samples. The samples should be handled only by persons associated in some way with the
monitoring program. A good general rule to follow is "the fewer hands the better," even though a
properly sealed sample may pass through a number of hands without affecting its integrity.
Each person handling the samples must be able to state from whom and when the item was received and
to whom and when it was delivered. A COC form should be used to track the handling of the samples
through various stages of storage, processing, and analysis at the laboratory. It is recommended practice
to have each person who relinquishes or receives samples sign the COC form for the samples. An
example of a form that may be used to establish the COC for samples generated in the field is shown in
Figure 8.2. This form should accompany the samples at all times from the field to the laboratory. All
persons who handle the samples should sign the form. Figure 8.3 is an example of a laboratory COC
form. COC forms should be retained and archived as described in Section 5 (Documents and Records).
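As an illustration, here is a minimal Python sketch of an electronic analogue to the COC form in Figure 8.2; the names and fields are hypothetical:

    from dataclasses import dataclass, field
    from datetime import datetime

    @dataclass
    class CustodyRecord:
        """Hypothetical electronic mirror of the paper COC form in Figure 8.2."""
        sample_id: str
        transfers: list = field(default_factory=list)

        def transfer(self, relinquished_by, received_by, comment=""):
            # Every change of hands is recorded with both parties and a timestamp.
            self.transfers.append((datetime.now(), relinquished_by, received_by, comment))

        def current_custodian(self):
            return self.transfers[-1][2] if self.transfers else None

    rec = CustodyRecord("PM25-2013-0142")
    rec.transfer("field operator", "courier", "sealed, custody tape intact")
    rec.transfer("courier", "lab sample custodian")
    print(rec.current_custodian())  # -> lab sample custodian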
When using professional services to transport physical samples, only reliable services that provide a
tracking number should be used. Information describing the enclosed samples should be placed on the bill
of lading. A copy of the shipping receipt and tracking number should be kept as a record. The package
should be addressed to the specific person authorized to receive the package, although it is recognized
that staff not typically part of the COC may receive the samples and deliver them to the authorized
addressee. A procedure must be in place to ensure that samples are delivered to the appropriate person
without being opened or damaged. In this circumstance, the sample is considered still in transport until
received by the authorized addressee. It may be necessary to ship and/or receive samples outside of
3 http://www.epa.gov/oecaerth/monitoring/programs/fifra/glp.html
normal business hours. A procedure should be developed in advance that considers staff availability,
secure storage locations, and appropriate storage conditions (e.g., temperature-controlled).
8.2.1 Sample Inspection and Acceptance
Once the samples arrive at their destination and at every custody change, the samples should first be
checked to ensure that their integrity is intact. The contents of the shipment should be checked against
the COC form to ensure that all samples listed were included in the shipment. If max/min thermometers
are used to monitor the temperature of the shipping containers, this information should be recorded to
document that temperatures were adequately maintained. When using passivated stainless steel canisters,
the canister pressure, upon receipt, should be recorded and compared to the final sample collection
pressure to indicate canister leakage and sample loss. It is recommended that this comparison be made
using a certified gauge that is calibrated annually. Any samples whose integrity or identity may be
questionable should be brought to the attention of the person/persons that are in the custody chain and
flagged. All flags should be carried along with the samples until the validity of the samples can be
proven. This information can be included in the remark section of the COC form.
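A minimal Python sketch of the canister pressure comparison described above; the tolerance value is an assumption made for illustration, not a requirement:

    def canister_leak_suspected(final_field_pressure, receipt_pressure, tolerance=1.0):
        """Flag a canister whose pressure changed between collection and receipt.

        Pressures read from a certified, annually calibrated gauge. The
        tolerance (1.0 psig here) is an illustrative value only.
        """
        return abs(receipt_pressure - final_field_pressure) > tolerance

    # A canister that arrived 2.5 psig below its final field pressure gets flagged.
    print(canister_leak_suspected(14.8, 12.3))  # -> True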
Chain of Custody Record
Project No. Project Title
Organization
Shipping
Container No. Contact
Field Samplers: print signature Address
Date Time Site/Location Sample Type Sample ID Remarks
Relinquished by (print and signature): Received by (print and signature): Comments
Figure 8.2 Example Field COC Form.
Chain of Custody Record
Project No. Project Title Organization
Laboratory/Plant: _________________________________________________
Sample Number Number of Containers Sample Description
Person responsible for samples Time: Date:
Sample Number Relinquished By: Received By: Time: Date: Reason for change in custody
Figure 8.3 Example Laboratory COC Form.
9.0 Analytical Methods
The choice of methods used for any environmental data operation should be based upon the program's
data quality objectives (DQOs). Outputs from the DQO process can help determine acceptable
measurement uncertainty and assist in the selection of methods capable of meeting the data quality
acceptance limits. Methods are usually selected based upon their performance characteristics (precision,
bias, limits of detection), ease of use, and their reliability in field and laboratory conditions.
Since both field and analytical procedures have been developed for the criteria pollutants in the Ambient
Air Quality Monitoring Program, and in the various technical assistance documents for the other national
ambient air programs, this section will discuss the general concepts of standard operating procedures and
good laboratory practices as they relate to the reference and equivalent methods. A more detailed
discussion on the attributes of SOPs can be found in Section 5.
Many ambient air methods utilize continuous instruments and therefore do not involve laboratory
analysis. However, particulate matter methods involve both continuous and manual methods, and some of
the other major monitoring programs involve sampling which requires the use of laboratory analysis.
Table 9-1 provides a summary of the pollutants measured and the analytical methods for these programs.
For the SLAMS Network pollutants, the methods listed are considered the reference methods and are not
the only methods available for use. Federal equivalent methods are available and posted, once approved,
on AMTIC and are considered an acceptable alternative to the reference method. Information on
reference and equivalent methods can be found on the AMTIC website as well as the current list of
designated Federal Reference and Equivalent Methods.1 CSN2 and NATTS3 SOPs are also on AMTIC.
Table 9-1 Acceptable Analytical Methods
Network | Pollutant | Acceptable Method | Reference
SLAMS | PM10 | Hi-Vol Gravimetric | 40 CFR Part 50 App B
SLAMS | PM10 - dichot | Gravimetric | 40 CFR Part 50 App J
SLAMS | PM2.5 | Gravimetric | 40 CFR Part 50 App L
SLAMS | PM10-2.5 | Gravimetric - difference | 40 CFR Part 50 App O
SLAMS | Pb from TSP | Inductively Coupled Plasma/Mass Spectrometry (ICP/MS)4 | 40 CFR Part 50 App G
SLAMS | Pb from PM10 | Energy Dispersive X-Ray Fluorescence (EDXRF) | 40 CFR Part 50 App Q
PAMS | VOCs | Gas Chromatography/Mass Spectrometry (GC/MS) | TO-15
PAMS | Carbonyl compounds | High Performance Liquid Chromatography (HPLC) | TO-11A
PAMS | Non-methane organic compounds | Cryogenic Preconcentration and Direct Flame Ionization Detection (PDFID) | TO-12
NATTS | Metals | Inductively Coupled Plasma (ICP) | IO 3.5
NATTS | Aldehydes | High Performance Liquid Chromatography | TO-11A
NATTS | VOCs | Gas Chromatography/Mass Spectrometry (GC/MS) | TO-15
CSN | PM2.5 | Gravimetric | 40 CFR Part 50 App L
CSN | Elements | Energy Dispersive X-Ray Fluorescence (EDXRF) | CSN QAPP and SOPs
CSN | Anions | Ion Chromatography | CSN QAPP and SOPs
CSN | Cations | Ion Chromatography | CSN QAPP and SOPs
CSN | Organic, Elemental, Carbonate, Total Carbon | Thermal Optical Reflectance (IMPROVE_A) | CSN QAPP and SOPs
CSN | Semi-volatile Organic Compounds | Gas Chromatography/Mass Spectrometry (GC/MS) | CSN QAPP and SOPs
1 http://www.epa.gov/ttn/amtic/criteria.html
2 http://www.epa.gov/ttn/amtic/specsop.html
3 http://www.epa.gov/ttn/amtic/airtox.html
4 As of the revision of this document, a new federal reference method for Pb by ICP-MS replaced the Atomic Absorption (AA) method (App G). The AA method will remain a federal equivalent method.
The SLAMS network provides more rigorous quality control requirements for the analytical methods.
These methods are found in 40 CFR Part 50, as described in the references. In addition, the method
identified for Pb is the reference method; there are a number of equivalent analytical methods available
for Pb. Some of the NATTS methods are derived from the Toxics Organic Method Compendium.5
Others, like the CSN Network6 methods, may be developed specifically for the program, based on
the national laboratory currently performing the analysis. The PAMS, NATTS and CSN networks follow
the performance-based measurement process paradigm. These networks' QA project plans or technical
assistance documents suggest a method, but also allow some flexibility to use other methods that meet the
network's measurement quality objectives. Various independent proficiency test samples and technical
systems audits are performed to ensure that the data quality within these networks remains acceptable.
AQS Parameter and Method Codes
Most monitoring information is reported to the Air Quality System (AQS). The pollutant measured is
called a parameter, and the specific method used is designated by a method code. AQS provides
a website that can assist in identifying the correct method code for data reporting.7 Any approved
reference or equivalent method listed on the AMTIC website has a reference or equivalent method
number. An example of an approved reference sampler is the BGI sampler listed below. This sampler can
be used with the Parameter Code 88101 (PM2.5 local conditions) and is associated with the method code
116. The method code is usually the last three digits of the designated reference (listed as RFPS) or
equivalent method (listed as EQPM).
BGI Inc. Models PQ200 or PQ200A PM2.5 Ambient Fine Particle Sampler
Manual Reference Method: RFPS-0498-116
BGI Incorporated Models PQ200 and PQ200A PM2.5 Ambient Fine Particle Sampler, operated with
firmware version 3.88 or 3.89R, for 24-hour continuous sample periods, in accordance with the Model
PQ200/PQ200A Instruction Manual and with the requirements and sample collection filters specified in 40
CFR Part 50, Appendix L, and with or without the optional Solar Power Supply or the optional dual-filter
cassette (P/N F-21/6) and associated lower impactor housing (P/N B2027), where the upper filter is used for
PM2.5. The Model PQ200A is described as a portable audit sampler and includes a set of three carrying cases.
Federal Register: Vol. 63, page 18911, 04/16/98
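As an illustration, a minimal Python sketch that extracts the method code from a designation string following the pattern described above; the format check is an assumption based on the example shown, not an official parsing rule:

    import re

    def aqs_method_code(designation):
        """Extract the 3-digit AQS method code from an EPA method designation.

        Per the text above, the method code is usually the last three digits
        of the designation (e.g., RFPS-0498-116 -> '116').
        """
        match = re.fullmatch(r"[A-Z]{4}-\d{4}-(\d{3})", designation)
        if not match:
            raise ValueError(f"unrecognized designation: {designation}")
        return match.group(1)

    print(aqs_method_code("RFPS-0498-116"))  # -> 116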
9.1 Good Laboratory Practices
Good laboratory practices (GLPs)8 refer to general practices that relate to many, if not all, of the
measurements made in a laboratory. They are usually independent of the SOP and cover subjects such as
maintenance of facilities, records, sample management and handling, reagent control, and cleaning of
laboratory glassware. In many cases, the activities mentioned above may not be formally documented
because they are considered common knowledge. However, for consistency in laboratory technique, these
activities should have some form of documentation.
5 http://www.epa.gov/ttn/amtic/airtox.html
6 http://www.epa.gov/ttn/amtic/specsop.html
7 http://www.epa.gov/ttn/airs/airsaqs/manuals/codedescs.htm
8 http://www.epa.gov/Compliance/monitoring/programs/fifra/glp.html
9.2 Laboratory Activities
For ambient air samples to provide useful information or evidence, laboratory analyses must meet the
following four basic requirements:
1. Equipment must be frequently and properly calibrated and maintained (Section 12).
2. Personnel must be qualified to make the analysis (Section 4).
3. Analytical procedures must be in accordance with accepted practice (Section 9.1 above), properly
documented, and have received peer and management review.
4. Complete and accurate records must be kept (Section 5).
It is assumed that at some frequency the laboratory would be audited by an independent part of the
monitoring organization or external entity (e.g., EPA Regions) that would document that the basic
requirements were being met.
As indicated, these subjects are discussed in other sections of this document. For the Ambient Air
Quality Monitoring Program, laboratory activities are mainly focused on the pollutants associated with
manual measurements for lead, particulate matter (PM and CSN), NATTS9 and PAMS10 (VOCs).
However, many laboratories also prepare reference material, test or certify instruments, and perform other
activities necessary to collect and report measurement data. Each laboratory should define these critical
activities and ensure there are consistent methods for their implementation.
9 http://www.epa.gov/ttn/amtic/airtox.html
10 http://www.epa.gov/ttn/amtic/files/ambient/pams/newtad.pdf
10.0 Quality Control
As described in Section 3, any data collection process that provides an estimate of a concentration
contains two types of uncertainty: population (spatial/temporal variability) and measurement uncertainty.
DQOs define the data quality needed to make a correct decision an acceptable percentage of the time.
Measurement quality objectives (MQOs) identify the quality control samples and the acceptance
criteria for those samples that will allow one to quantify the data quality indicators: precision, bias,
representativeness, detection limit, completeness and comparability. The MQOs are designed to evaluate
and control various phases (sampling, preparation, analysis) of the measurement process to ensure that
total measurement uncertainty is within the range prescribed by the DQOs.
Data quality assessment (DQA) is the scientific and statistical evaluation of environmental data
to determine if they meet the planning objectives of the project, and thus are of the right type,
quality, and quantity to support their intended use.1 DQA is built on a fundamental premise: data
quality is meaningful only when it relates to the intended use of the data, which in many cases
stems from the DQOs. DQAs can be used to determine whether modifications to the DQOs are
necessary or tighter quality control is required.
10.1 The Quality Control Process
Within any phase or step of the data collection process, errors can occur. For example:
samples and filters can be mislabeled;
data can be transcribed or reported incorrectly or information management systems can be
programmed incorrectly;
calibration or check standards can be contaminated or certified incorrectly resulting in faulty
calibrations;
instruments can be set up improperly or over time fail to operate within specifications; and
SOPs may not be followed.
Quality Control (QC) is the overall system of technical activities that measures the attributes and
performance of a process, item, or service against defined standards to verify that they meet the stated
requirements established by the customer.2 Quality control includes establishing specifications or
acceptance criteria for each quality characteristic of the monitoring/analytical process, assessing
procedures used in the monitoring/analytical process to determine conformance to these specifications,
and taking any necessary corrective actions to bring them into conformance. The EPA's QAPP guidance
1 Data Quality Assessment: Statistical Methods for Practitioners, http://www.epa.gov/quality/qs-docs/g9s-final.pdf
2 American National Standard ANSI/ASQ E4-2004, http://www.asq.org/
document, QA/G-5,3 suggests that QC activities are those technical activities routinely performed, not to
eliminate or minimize errors, but to measure their effect. The effect of an error, such as lab contamination
leading to high PM2.5 values, might lead to incorrectly concluding a site was in non-attainment. Although
there is agreement that the measurement or assessment of a QC check does not itself eliminate errors, the
QC data can and should be used to take appropriate corrective actions which can minimize error or
control data to an acceptable level of quality in the future. So, QC is both proactive and corrective. It
establishes techniques to determine if field and lab procedures are producing acceptable data and
identifies actions to correct unacceptable performance.
The goal of quality control is to provide a reasonable level of checking at various stages of the data
collection process to ensure that data quality is maintained and, if the quality has not been maintained,
that this is discovered with a minimal loss of data (invalidation). Figure 10.1 provides an example of
some of the QC samples used in the PM2.5 data collection process. The figure also identifies the sources
of error associated with each QC sample. So, in developing a quality control strategy, one must weigh the
costs associated with quality control against the risks of data loss.
With the objective to minimize data loss, quality control data are most beneficial when they are assessed
as soon as they are collected. Therefore, information management systems can play a very important role
in reviewing QC data and flagging or identifying spurious data for further review. These information
management procedures can help the technical staff review the QC checks coming from a number of
monitoring sites in a consistent and time efficient manner. There are many graphical techniques (e.g.,
control charts and outlier checks) that can be employed to quickly identify suspect data. More details of
information management systems are discussed later in this section. It is the responsibility of the
monitoring organization, through the development of its QAPP, policies and procedures, to develop and
document the:
QC techniques;
frequency of the QC checks and the point in the measurement process that the check is
introduced;
traceability of QC standards;
matrix of the check sample;
appropriate test concentrations;
actions to be taken in the event that a QC check identifies a failed or changed measurement
system;
formulae for estimating data quality indicators;
QC results, including control charts; and
the means by which the QC data will be used to determine that the measurement performance is
acceptable.
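Where control charts are called for above, the following minimal Python sketch (illustrative only; the QC history values are hypothetical) computes simple Shewhart-style limits and screens a new check result:

    import numpy as np

    def control_limits(qc_values):
        """Compute simple control limits from historical QC checks.

        Returns (center, lower, upper) using mean +/- 3 standard deviations,
        a common convention; a QAPP may define different limits.
        """
        values = np.asarray(qc_values, dtype=float)
        center = values.mean()
        sigma = values.std(ddof=1)
        return center, center - 3 * sigma, center + 3 * sigma

    def out_of_control(qc_values, new_value):
        _, lower, upper = control_limits(qc_values)
        return not (lower <= new_value <= upper)

    history = [2.1, 1.8, 2.3, 2.0, 1.9, 2.2, 2.0, 2.1]  # % difference of past checks
    print(out_of_control(history, 4.5))  # -> True, flag for corrective action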
3 http://www.epa.gov/quality/qa_docs.html
Figure 10.1 QC samples for PM2.5 placed at various stages of the measurement process (figure not reproduced; it maps field blanks, lab blanks, QC checks, routine samples, collocated samples, and PEP audits across the pre-field weighing, field sampling, and post-field weighing stages to error sources such as contamination, instrument precision/bias, measurement system precision and bias, and weighing lab precision/bias).
10.2 QC Activity Areas
For air monitoring projects the following three areas must have established QC activities, procedures and
criteria:
1. Data Collection.
2. Data management and the verification and validation process.
3. Reference materials.
Data collection includes any process involved in acquiring a concentration or value, including but not
limited to: sample preparation, field sampling, sample transportation, field analytical (continuous)
methods, and laboratory preparation/analytical processes. Depending on the importance of the data and
resources available, monitoring programs can implement QC samples, as illustrated in Figure 10.1, to
identify the errors occurring at various phases of the monitoring process. Many of the QC samples can
identify errors from more than one phase. Table 10-1 provides a list of the majority of the QC samples
utilized in the ambient air program and includes both their primary (double check mark) and secondary
(single check mark) uses in error identification. Many of these checks are required in CFR; others are
strongly suggested in the method guidance. The MQO/validation templates provided in Appendix D give
the minimum requirements for the frequency at which these checks should be implemented, but many
monitoring organizations choose more frequent checking in order to reduce the risk of data invalidation.
A good example of this increased effort is the zero/span and one-point precision checks for the gaseous
criteria pollutants. Although CFR requires the check to be performed once every two weeks, due to the
advent of more sophisticated automated monitoring systems, many monitoring organizations perform
these checks every 24 hours (11:45 PM-12:15 AM). In addition, once the QC checks are developed for a
particular monitoring method, it is important to identify the acceptance criteria and what corrective
action will be taken when a QC check fails. The MQO/validation template in Appendix D can be used to
list the QC samples, with a column added to include corrective action. Table 10-2 provides an example of
a QC sample table for PM2.5. Although the validation templates provide guidance for when data should be
invalidated, it is up to the monitoring organization to provide the specific corrective actions for the failure
of a specific QC check; therefore, Table 10-2 does not identify specific corrective actions.
Data management quality control is discussed in more detail in Section 14 and the
verification/validation process in Section 17. Automated verification/validation processes require some
frequency of checking to ensure that they are performed correctly since errors in programming can cause
persistent errors for long periods of time. At times new versions of software can cause programs that
worked properly in the past to falter. Providing QC checks to software (e.g., entering a data set with
known errors that the programs are expected to identify) to ensure it operates properly is strongly suggested.
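A minimal sketch of this planted-errors idea, in Python with a toy validator; all names and values are hypothetical:

    def validate_hourly(records):
        """Toy validator: flag negative concentrations and missing values."""
        return [i for i, value in enumerate(records) if value is None or value < 0]

    def self_test_validator():
        """Feed the validator a data set with known, planted errors and confirm
        it finds exactly those records; rerun after any software update."""
        planted = [0.041, -0.002, 0.038, None, 0.040]  # errors at indexes 1 and 3
        assert validate_hourly(planted) == [1, 3], "validation software failed self-test"

    self_test_validator()
    print("validator self-test passed")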
Reference materials are the standards by which many of the QC checks are performed. Reference
material can be gaseous standards as well as devices (e.g., flow rate standards). If these standards are not
checked and verified as to their certified values, then the quality of data becomes suspect. Reference
materials need to be certified and recertified at acceptable frequencies in order to maintain the integrity of
QA Handbook Vol II, Section 10
Revision No:0
Date: 05/13
Page 4 of 12
the reference material. It is suggested that standards be certified annually. More discussion on standards is
included in Section 12.
Other elements of an organizations QAPP that may contain related sampling and analytical QC
requirements include:
Sampling Design which identifies the planned field QC samples as well as procedures for
QC sample preparation and handling;
Sampling Method Requirements which includes following the QC requirements of the
reference methods found in CFR Part 50 and for determining if the collected samples
accurately represent the population of interest (representativeness);
Sample Handling and Custody Requirements which discusses any QC devices employed
to ensure samples are not tampered with (e.g., custody seals) or subjected to other
unacceptable conditions during transport;
Analytical Methods Requirements which includes information on the subsampling methods
and information on the preparation of QC samples (e.g., blanks and replicates); and
Instrument Calibration and Frequency which defines prescribed criteria for triggering
recalibration (e.g., failed calibration checks).
10.3 Internal vs. External Quality Control
Quality control can be separated into two major categories: internal QC and external QC. Both types of
quality control are important in a well implemented quality system.
Internal
Most of the quality control activities take place internally, meaning the monitoring organization
responsible for collecting the data develops and implements the quality control activities, evaluates
the data, and takes corrective action when necessary. The internal activities can be used to take
immediate action if data appear to be outside acceptance criteria.
External QC
External quality control can be implemented as an audit with external/independent devices or through
the submission of samples of two types: double-blind, meaning the sample is not identified as a QC
sample (it looks like a routine sample) and its concentration is unknown; or single-blind, meaning the
sample is known to be a QC sample but its concentration is unknown to the person or organization
performing the measurement. These samples are also called performance evaluation or proficiency test
samples and are explained in Section 15. External QC may identify errors occurring in internal QC
activities. For example, an external flow rate audit may identify an internal flow rate verification
standard that is out of calibration. Because these checks are performed by external organizations, the
results are not always immediately available and therefore have a diminished capacity to control data
quality in real time. However, they are useful as an objective test of the internal QC procedures and may
identify errors (e.g., biased or contaminated standards) that might go unnoticed in an internal QC
system.
Table 10-1 QC samples utilized in the ambient air program and their primary and secondary uses in error identification (table not reproduced).
Table 10-2 PM2.5 Field and Lab QC Checks
Requirement | Frequency | Acceptance Criteria
(The Corrective Action column is left blank in this example; as noted above, specific corrective actions are up to the monitoring organization.)

Field QC Checks
Calibration Standard Recertifications:
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable Std.
Field Thermometer | 1/yr | ±0.1 °C resolution, ±0.5 °C accuracy
Field Barometer | 1/yr | ±1 mm Hg resolution, ±5 mm Hg accuracy
Verification/Calibration:
Flow Rate (FR) Calibration | if multi-point failure | ±2% of transfer standard
FR multi-point verification | 1/yr | ±2% of transfer standard
One-point FR verification | 1/mo | ±4% of transfer standard
External Leak Check | every 5 sampling events | < 80 mL/min
Internal Leak Check | every 5 sampling events | < 80 mL/min
Temperature Calibration | if multi-point failure | ±2% of standard
Temp multi-point verification | on installation, then 1/yr | ±2 °C of standard
One-point temp verification | 1/mo | ±4 °C of standard
Pressure Calibration | on installation, then 1/yr | ±10 mm Hg
Pressure Verification | 1/mo | ±10 mm Hg
Clock/timer Verification | 1/mo | 1 min/mo
Blanks:
Field Blanks | see 2.12 reference | ±30 µg
Precision Checks:
Collocated samples | every 12 days | CV < 10%
Audits (external assessments):
FRM PEP | 5 or 8 sites/year | ±10%
Flow rate audit | 1/6 mo | ±4% of audit standard
External Leak Check | 1/6 mo | < 80 mL/min
Internal Leak Check | 1/6 mo | < 80 mL/min
Temperature Audit | 1/year | ±2 °C
Pressure Audit | 1/year | ±10 mm Hg

Laboratory QC Checks
Blanks:
Lot Blanks | 9/lot | ±15 µg difference
Exposure lot blanks | 3 per lot | ±15 µg difference
Lab Blanks | 10% or 1 per weighing session | ±15 µg difference
Verification/Calibration:
Balance Calibration | 1/yr | manufacturer's spec.
Lab Temp. Calibration | 1/6 mo | ±2 °C
Lab Humidity Calibration | 1/6 mo | ±2%
Bias:
Balance Audit | 1/year | ±15 µg for unexposed filters
Balance Check | beginning, every 10th sample, end | < ±3 µg
Calibration standards:
Working Mass Stds. | 3-6 mo | ±25 µg
Primary Mass Stds. | 1/yr | ±25 µg
Precision:
Duplicate filter weighings | 1 per weighing session | ±15 µg difference
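To illustrate how the collocated-precision criterion in Table 10-2 (CV < 10%) might be screened, here is a minimal Python sketch; the aggregate shown is a common summary of paired percent differences, not the exact 40 CFR Part 58 Appendix A estimator (which adds a confidence-limit factor), and the data are hypothetical:

    import math

    def percent_differences(primary, collocated):
        """Relative percent difference for each valid collocated pair."""
        return [200.0 * (x - y) / (x + y) for x, y in zip(primary, collocated)]

    def collocated_cv(primary, collocated):
        """Simple precision aggregate from collocated pairs: sqrt(sum(d_i^2)/(2n))."""
        d = percent_differences(primary, collocated)
        return math.sqrt(sum(di * di for di in d) / (2 * len(d)))

    primary = [12.1, 8.4, 15.3, 9.9]       # ug/m3, routine sampler
    duplicate = [11.7, 8.9, 14.8, 10.3]    # ug/m3, collocated sampler
    print(round(collocated_cv(primary, duplicate), 2), "% (acceptance: CV < 10%)")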
10.4 CFR Related Quality Control Samples
40 CFR Part 58, Appendix A identifies a number of quality control samples that must be implemented for
the SLAMS (and NCore) SPM and PSD networks. By 2009, any special purpose monitors that use FRMs
or FEMs will be required to follow these requirements unless granted a waiver by the Regional
Administrator (or delegate). Table 10-3 provides a summary of the QC checks for the criteria pollutants
and the CFR reference where an explanation of each check is described. The reader should distinguish the
requirements that are related to automated and manual methods since there are some differences.
Table 10-3 Ambient Air Monitoring Measurement Quality Samples
Method | CFR Reference | Coverage (annual) | Minimum Frequency | MQOs*

Automated Methods
One-Point QC (SO2, NO2, O3, CO) | Section 3.2.1 | Each analyzer | Once per 2 weeks | O3: precision 7%, bias ±7%; SO2, NO2, CO: precision 10%, bias ±10%
Annual performance evaluation (SO2, NO2, O3, CO) | Section 3.2.2 | Each analyzer | Once per year | See validation templates, App D
Flow rate verification (PM10, PM2.5, PM10-2.5) | Section 3.2.3 | Each sampler | Once every month | < 4% of standard and < 5% of design value
Semi-annual flow rate audit (PM10, PM2.5, PM10-2.5) | Section 3.2.4 | Each sampler | Once every 6 months | < 4% of standard and < 5% of design value
Collocated sampling (PM2.5, PM10-2.5) | Section 3.2.5 | 15% within PQAO | Every twelve days | PM2.5: 10% precision; PM10-2.5: 15% precision; TSP: 10% precision
Performance evaluation program (PM2.5, PM10-2.5) | Section 3.2.7 | 1) 5 valid audits for primary QA orgs with ≤ 5 sites; 2) 8 valid audits for primary QA orgs with > 5 sites; 3) all samplers in 6 years | Over all 4 quarters | PM2.5: ±10% bias; PM10-2.5: ±15% bias

Manual Methods
Collocated sampling (PM10, PM10-2.5, PM2.5, Pb-TSP, Pb-PM10) | 3.3.1 and 3.3.5 | 15% within PQAO | Every 12 days; PSD every 6 days | PM10, TSP, PM2.5: 10% precision; PM10-2.5: 15% precision
Flow rate verification (PM10 low-vol, PM10-2.5, PM2.5, Pb-PM10) | 3.3.2 | Each sampler | Once every month | < 4% of standard and < 5% of design value
Flow rate verification (PM10 high-vol, Pb-TSP) | 3.3.2 | Each sampler | Once every quarter | Varies by instrument type; see validation templates
Semi-annual flow rate audit (PM10 low-vol, PM10-2.5, PM2.5) | 3.3.3 | Each sampler, all locations | Once every 6 months | < 4% of standard and < 5% of design value
Semi-annual flow rate audit (PM10 high-vol, Pb-TSP) | 3.3.3 | Each sampler, all locations | Once every 6 months | Varies by instrument type; see validation templates
Pb analysis audits (Pb-TSP, Pb-PM10) | 3.3.4 | 1) each sampler; 2) analytical (lead strips) | 1) include with TSP; 2) each quarter | 1) same as for TSP; 2) ±10% bias
Performance evaluation program (PM2.5, PM10-2.5) | 3.3.7 and 3.3.8 | 1) 5 valid audits for primary QA orgs with ≤ 5 sites; 2) 8 valid audits for primary QA orgs with > 5 sites; 3) all samplers in 6 years | Over all 4 quarters | PM2.5: ±10% bias; PM10-2.5: ±15% bias
* Some of the MQOs are found in CFR and others in Appendix D of this guidance document.
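As an illustration of the one-point QC MQOs in Table 10-3, a minimal Python sketch; the measured and audit values are hypothetical:

    def one_point_qc(measured, audit, limit_pct):
        """Percent difference for a one-point QC check and pass/fail against
        the MQO from Table 10-3 (7% for O3; 10% for SO2, NO2, CO)."""
        d = 100.0 * (measured - audit) / audit
        return d, abs(d) <= limit_pct

    d, ok = one_point_qc(measured=0.0712, audit=0.0700, limit_pct=7.0)
    print(f"{d:+.1f}% -> {'pass' if ok else 'fail'}")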
Blanks and Blank Correction
The objective for collecting blanks at various phases of sample collection is to determine whether
contamination is occurring at that phase, be it in the field, during sample transport or at the analytical
laboratory, and to try to reduce this contamination if it is greater than acceptance limits. Some level of
contamination is acceptable and values below the acceptance limits do not require corrective action or
investigation. Values above this level should be investigated in order to reduce this contamination to
acceptable levels. EPA does not endorse blank correction of data unless it is already an accepted practice
in a monitoring program/method. In rare cases there may be a laboratory or measurement phase that has a
measurable, consistent and documented level of contamination that cannot be eliminated and blank
correction may be contemplated to adjust the data for this contamination. In this case the agency should
contact the EPA Region for advice.
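To illustrate the screening described above, a minimal Python sketch using the PM2.5 field blank acceptance limit from Table 10-2 (±30 µg); the blank values are hypothetical:

    def screen_blanks(blank_masses_ug, limit_ug):
        """Flag blanks whose mass change exceeds the acceptance limit.

        Table 10-2 lists +/-30 ug for PM2.5 field blanks and +/-15 ug for lab
        blanks; values within the limit need no corrective action.
        """
        return [m for m in blank_masses_ug if abs(m) > limit_ug]

    field_blanks = [4.0, -2.5, 31.2, 7.8]
    print(screen_blanks(field_blanks, 30.0))  # -> [31.2]: investigate contamination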
Operating Ranges, Calibration Scale, Zero, Span, 1-Point QC Checks and Performance Evaluations
Due to successes over the years in reducing pollution, ambient air monitoring concentrations are steadily
decreasing. Many monitoring organizations are now purchasing trace gas monitors not only for NCore
sites but also for the routine monitoring sites. The ambient air QA regulations have kept up with this trend
by lowering and expanding the performance evaluation audit levels and suggesting that the audit levels
chosen should represent or bracket 80 percent of ambient concentrations measured by the analyzer
being evaluated. The regulation also suggests that the one-point QC checks for the gases should be related to the routine concentrations normally measured at sites within the monitoring network in order to appropriately reflect the precision and bias at these routine concentration ranges. The intent of the
regulatory language is to perform and report quality control data at concentrations more reflective of the
routine concentrations.
When the ambient air QA regulations and guidance were initially promulgated, routine concentrations were higher, and there were different reference methods, different and less sensitive monitoring instruments
and calibration technologies and a different quality of gas standards. All of the technological change has
been for the better and should allow for better precision and bias at lower concentration ranges. In
addition, older guidance may have suggested that monitors had to be operated and calibrated at one of the
ranges for which they were approved. Current guidance suggests the following for each of the QC checks
for gaseous pollutants:
Operating Range - This term should be used for the ranges that are promulgated in the approved federal
reference method (FRM) or federal equivalent method (FEM) designation. Some instruments have been
designated for more than one operating range and one range may need to be selected for operating the
instrument. This range needs to be acknowledged when determining calibration concentration but only to
the extent that one would not operate within one operating range and calibrate with points higher than the
selected operating range.
Calibration Scale - This term should be used to indicate the concentration range over which the instrument is calibrated. EPA feels that monitoring organizations should have more flexibility in deciding their calibration scale; although it needs to be within the selected operating range, the calibration does not necessarily need to be performed at concentration levels not normally measured by the monitor. Figure 10.2 provides an example of some calibrations performed in the past where the 4 calibration points (plus zero point) were spread evenly across the operating range starting at 80% of the operating range. As indicated, the routine data for this site are clustered around the lowest calibration point. It is suggested that monitoring organizations select a calibration scale that provides more calibration points at the lower concentrations to establish a better test of linearity at the routine concentration ranges.
Zero Point - The bi-weekly zero point check is fairly well defined: a straightforward procedure using zero air generators or standards to measure a zero point. Some air monitoring analyzers are capable of
periodically carrying out automatic zero and span calibrations and making their own zero and span self
adjustments to predetermined readings. EPA discourages the use of automatic span adjustments but
considers automatic zero adjustments reasonable when 1) the automatic zero standards pass through the
sample inlet and sample conditioning system, 2) the zero point/adjustment is performed daily, and 3) both
the adjusted and unadjusted zero response readings can be obtained from the data recording device.
Previously collected routine data should not be corrected based upon zero or span values.
Span Point - The bi-weekly span points have traditionally been set at 80-90% of the operating range, as indicated in Figure 10.2. A span check concentration should be selected that is more beneficial to the quality control of the routine data at the site, and EPA suggests: 1) the selection of an appropriate calibration scale (as described above) and, 2) selecting a span that, at a minimum, is above 120% of the highest NAAQS (for sites used for designation purposes) and above the 99th percentile of the routine data over a 3-year period.
One-Point QC - The bi-weekly one-point QC check is required to be reported within the range of 0.01-0.10 ppm for O3, SO2 and NO2, and 1-10 ppm for CO. The concentration selected should be related to the routine concentrations normally measured at sites within the monitoring network in order to appropriately reflect the precision and bias at the routine concentration ranges.
Annual Performance Evaluations (PE) - A November 10, 2010 Technical Memorandum [4] expanded the list of annual PE audit levels from 5 to 10 and revised the selection process so that one did not have to select three consecutive levels. This language will be updated in the next revision of 40 CFR Part 58 Appendix A. The audit levels chosen should represent or bracket 80 percent of ambient concentrations measured by the analyzer being evaluated. Because the audit levels were expanded to allow for lower concentration audits to support NCore and trace-level work, a February 11, 2011 Technical Memo [5] was posted on AMTIC in which EPA suggests the use of the following acceptance criteria for the level 1 and 2 audit ranges:
For O3, SO2, and NO2: ±1.5 ppb difference or ±15 percent difference, whichever is greater.
For CO: ±0.03 ppm difference or ±15 percent difference, whichever is greater.
For audit levels 3-10, the 15 percent difference acceptance criterion, currently in guidance, is acceptable.

[4] Expanded List of Audit Levels for Annual Performance Evaluation for SO2, NO2, O3, and CO as Described in 40 CFR Part 58 Appendix A Section 3.2.2. http://www.epa.gov/ttn/amtic/cpreldoc.html
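The whichever-is-greater comparison above can be expressed as a one-line test. A minimal sketch in Python; the measured and audit values are hypothetical examples:

```python
# Minimal sketch of the level 1 and 2 audit acceptance test: a point
# passes if it is within the absolute limit OR the percent limit,
# whichever is greater.

def audit_point_acceptable(measured: float, audit: float,
                           abs_limit: float, pct_limit: float = 15.0) -> bool:
    """True if |measured - audit| is within max(abs_limit, pct_limit%)."""
    diff = abs(measured - audit)
    return diff <= max(abs_limit, audit * pct_limit / 100.0)

# O3/SO2/NO2 (ppb): 1.5 ppb or 15 percent, whichever is greater.
print(audit_point_acceptable(6.2, 5.0, abs_limit=1.5))     # True: 1.2 <= 1.5
# CO (ppm): 0.03 ppm or 15 percent, whichever is greater.
print(audit_point_acceptable(0.26, 0.30, abs_limit=0.03))  # True: 0.04 <= 0.045
```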
Selecting Appropriate Concentration Ranges for Gaseous QC Samples
The regulations attempt to provide some flexibility on how monitoring organizations choose the QC
concentration ranges. The following scenario is an acceptable approach to selecting the QC
concentrations. It uses ozone data from a typical routine monitoring site. Figure 10.3 illustrates this
process.
1) Take 3 years of 8-hour or 1-hour max values (101 ppb is the highest 8-hour max for this example).
2) Multiply the highest 8-hour or 1-hour max by 1.5 to establish the calibration scale (150 ppb).
   a) If the calculation in step 2 is below the NAAQS, use 1.5x the NAAQS (if sites are used for regulatory purposes).
3) Take 80% of the calibration scale (120 ppb) to establish the span check value. The span check can now serve as a bi-weekly check to protect the NAAQS.
4) Use the current CFR requirements to select 1-point QC checks. Since the current 1-point QC check range is 10-100 ppb for O3 and the mean 8-hour max is around 50 ppb, 50 ppb would be an adequate concentration for this site.
5) This information can be used to select the annual PE audit levels. Since the audit levels should reflect 80% of the routine data, an 80% box could be created to select 80% of the routine data. The 80% box can slide in both the left and the right direction. In the case of Figure 10.3, the box represents the upper 80% of the routine data. Therefore, PE audit levels 3, 4, 5 and 6 would be the most appropriate to select for this site.

[5] Guidance on Statistics for Use at Audit Levels 1 and 2 of the Expanded List of Audit Levels for Annual Performance Evaluation for SO2, NO2, O3, and CO as Described in 40 CFR Part 58 Appendix A Section 3.2.2. http://www.epa.gov/ttn/amtic/files/ambient/pm25/datamang/20110217lowlevelstatmemo.pdf
The approach described above is an example that allows for flexibility depending on the sites and the concentrations measured within a monitoring network. This approach can be used for individual sites (where there is greater variability in concentrations across the network) or it can be used for an aggregate of sites within a PQAO (where less variability in concentrations exists). The approach can be used with one year of data or with multiple years of data. Two issues should dictate the approach used:

- Ensure that the calibration scale exceeds the range of real and possible routine concentrations and is above any primary and secondary NAAQS.
- Ensure the span check is protective of the NAAQS.

The monitoring organization's QAPP would document the approach used; a sketch of the selection arithmetic follows.
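A minimal Python sketch of the five-step scenario above, using the example ozone numbers from the text. The NAAQS level is an assumed input, and the data values are hypothetical:

```python
# Minimal sketch of the QC concentration selection scenario (steps 1-5).
NAAQS_PPB = 75.0  # assumed 8-hour ozone NAAQS level for this illustration

def select_qc_concentrations(highest_max_ppb: float, mean_max_ppb: float):
    # Step 2 (and 2a): calibration scale = 1.5 x highest max,
    # or 1.5 x NAAQS if that is larger (regulatory sites)
    cal_scale = max(1.5 * highest_max_ppb, 1.5 * NAAQS_PPB)
    # Step 3: span check = 80% of the calibration scale
    span = 0.8 * cal_scale
    # Step 4: 1-point QC near routine concentrations, clamped to the
    # current 10-100 ppb range allowed for O3
    one_point_qc = min(max(mean_max_ppb, 10.0), 100.0)
    return cal_scale, span, one_point_qc

# Example from the text: highest 8-hour max 101 ppb, mean 8-hour max ~50 ppb.
# Returns (151.5, 121.2, 50.0); the text rounds these to 150 and 120 ppb.
print(select_qc_concentrations(101.0, 50.0))
```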
10.5 Use of Computers for Quality Control
With the wide range of computers now available, and the advancements in data acquisition system (DAS)
technologies, consideration should be given to a computer system that can process and output the
information in a timely fashion. Such a computer system should be able to:
- compute calibration equations
- compute measures of linearity of calibrations (e.g., standard error or correlation coefficient)
- plot calibration curves
- compute zero/span drift results
- plot zero/span drift data
- compute precision and bias results
- compute control chart limits
- plot control charts
- automatically flag out-of-control results
- maintain and retrieve calibration and performance records
- format data for reporting to AQS.
Some of these checks (e.g., calibrations) only need to be reviewed as needed or when the actual check is
performed. Other checks, like zero/span/one point QC checks or programmed routine data range or
outlier checks that may occur every day are much more easily performed automatically by properly
programmed computer systems. Earlier versions of this Handbook provided examples of quality control charts for zero and span drifts, but with the advanced data acquisition system technologies available, the development of these charts is fairly straightforward. Figure 10.4 represents daily CO span checks over a 3-month period. This control chart can be downloaded from the American Society for Quality (ASQ) web site [6].
Many vendors offering newer generation data loggers and ambient air information management systems provide programming of some of the QC checking capabilities listed above. EPA has also provided guidance and a Data Assessment Statistical Calculator (DASC) [7] tool for the precision and bias calculations of the quality control checks required in 40 CFR Part 58, Appendix A. In addition, the AMP 255 Report in AQS also provides these statistics for many of the QC samples described in Table 10-3, but use of the 255 Report requires data reporting to AQS, which does not usually occur in the time frames needed for quality control.
[6] http://asq.org/learn-about-quality/data-collection-analysis-tools/overview/control-chart.html
[7] DASC tool on AMTIC at http://www.epa.gov/ttn/amtic/qareport.html
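A minimal sketch of the kind of control chart calculation described above, using common 3-sigma Shewhart limits; the daily span-check percent differences are hypothetical examples:

```python
# Minimal sketch: compute center line and 3-sigma control limits for
# daily span-check percent differences and flag out-of-control points.
import statistics

span_pct_diff = [1.2, -0.4, 0.8, 0.3, -1.1, 0.6, 0.9, -0.2, 0.4, 1.0]

mean = statistics.mean(span_pct_diff)
sd = statistics.stdev(span_pct_diff)
ucl, lcl = mean + 3 * sd, mean - 3 * sd  # upper/lower control limits

for day, d in enumerate(span_pct_diff, start=1):
    flag = " OUT OF CONTROL" if not (lcl <= d <= ucl) else ""
    print(f"day {day:2d}: {d:+.1f}%{flag}")
print(f"center {mean:+.2f}%, limits [{lcl:+.2f}%, {ucl:+.2f}%]")
```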
11.0 Instrument Equipment Testing, Inspection and Maintenance
Implementing an ambient air monitoring network, with the various types of equipment needed, is no easy
task. Through appropriate testing, inspection and maintenance programs, monitoring organizations can
be assured that equipment is capable of operating at acceptable performance levels. Every piece of
equipment has an expected life span, and its use should be discontinued if its performance quality ceases
to meet appropriate standards. For amortization purposes, EPA estimates a 7-year lifespan for most monitoring instruments and a somewhat longer lifespan for more permanent types of equipment (instrument racks, monitoring shelters, etc.). This schedule means that funds for replacing capital
equipment are provided in resource allocations and monitoring organizations should make the best use of
equipment replacement resources. Monitoring organizations may be able to prolong the life of equipment
but in doing so they may run the risk of additional downtime, more upkeep and a greater chance of data
invalidation, while losing out on newer technologies, better sensitivity/stability and the opportunities for
better information management technologies.
Due to the many types of equipment that can be used in an ambient air monitoring program, this section
provides general guidance on testing, inspection and maintenance procedures for broad categories of
equipment only. In most cases, equipment manufacturers include inspection and maintenance
information in the operating manuals. The role of monitoring organizations, in developing a quality
system, is to address the scheduling and documentation of routine testing, inspection, and maintenance.
Detailed maintenance documents should be available for each monitoring site. Elements incorporated
into testing, inspection and maintenance documents include:
equipment lists - by organization and station;
spare equipment/parts lists - by equipment, including suppliers;
inspection/maintenance frequency - by equipment;
testing frequency and source of the test concentrations or equipment;
equipment replacement schedules;
sources of repair - by equipment;
service agreements that are in place; and
monthly check sheets and entry forms for documenting testing, inspections and maintenance
performed.
11.1 Instrumentation
11.1.1 Analyzers and Samplers
Aside from the specific exceptions described in Appendix C of Part 58 [1], monitoring methods used for SLAMS monitoring must be a reference or equivalent method, designated as such by 40 CFR Part 53 [2], and will be labeled as such [3]. Reference or equivalent methods also must be used at NCore monitoring sites
intended for comparison with any NAAQS. Among reference and equivalent methods, a variety of
analyzer designs and features are available. For certain pollutants, analyzers employing different
measurement principles are available. Some analyzer models only meet the minimum performance
specifications (see Table 7-6), while others provide a higher level of performance. Section 7 provides
information on what aspects to consider when selecting a particular monitoring instrument/analyzer.
[1] 40 CFR Part 58, Appendix C. http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title40/40tab_02.tpl
[2] 40 CFR Part 53.
[3] 40 CFR Part 53.9(d).
Upon receiving a new analyzer, the user should carefully read the instructions or operating manual provided by the manufacturer. Information or instructions concerning the following should be found in the manufacturer's manual:
unpacking and verifying that all component parts were delivered;
checking for damage during shipment;
checking for loose fittings and electrical connections;
assembling the analyzer;
installing the analyzer;
calibrating the analyzer;
operating the analyzer;
electrical and plumbing diagrams;
preventive maintenance schedule and procedures;
troubleshooting; and
a list of expendable parts.
NOTE: Many vendors have specific time periods within which the initial checks for damage in transit need to be made, so it may be important to perform an initial check/verification of the equipment as soon as possible. The monitor should be assembled and set up according to the instructions in the manufacturer's manual.
Initial Set-up and Acceptance Testing
It may be important to do this initial set-up and testing at the main office or laboratory facility (see
Section 11.1.3) before taking the equipment to the site. Following analyzer set-up, and allowance for the
instrument to reach required operating conditions, an initial verification of performance characteristics
such as power flow, noise, response time and a multi-point verification should be performed to determine
if the analyzer is operating properly. These guidelines assume that the instrument was previously
calibrated. If the instrument was disassembled after calibration, or no calibration of the instrument had
previously been performed, the monitor must have a multi-point verification/calibration to ensure it is
within acceptable calibration requirements prior to use. Zero/span drift and precision should be checked
during the initial calibration or measured using abbreviated forms of the test procedures provided in 40
CFR Part 53. Acceptance of the analyzer should be based on results from these performance tests. If the
analyzer does not perform to stated specifications, document the testing procedures and data and contact
the manufacturer for corrective action. Once accepted, reference and equivalent analyzers are guaranteed
by the manufacturer to operate within the required performance specifications for one year [4], unless major repairs are performed or parts are replaced. In such instances, the analyzers must be recalibrated before use.
11.1.2 Support Instrumentation
Experience of monitoring organization staff, preventive maintenance requirements, ease of maintenance, and general reliability all play crucial roles in the selection of support equipment. The following examples
depict general categories of support equipment and typical features to look for when selecting this
equipment. This list is meant to guide agencies in the selection of equipment and does not represent
required specifications.
[4] 40 CFR Part 53.9(c).
Calibration Standards: Calibration standards fall into several categories:
- mass flow controlled (MFC) devices;
- standards that meet the 2012 Traceability Protocol for Gaseous Calibration Standards [5];
- permeation devices;
- voltage standards for equipment testing;
- photometers;
- flow measurement devices;
- barometric pressure measurement devices; and
- temperature measurement devices.
It is recommended that the devices be 110 VAC, be compatible with data acquisition systems for
automated calibrations, and have digital compatibility or true transistor-transistor logic (TTL).
The most common standards are MFC devices and permeation devices. Both use dilution air to
obtain the needed output pollutant concentration.
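A minimal sketch of the dilution arithmetic such MFC-based calibrators perform (output concentration scales with the ratio of pollutant gas flow to total flow); the cylinder concentration and flows are hypothetical examples:

```python
# Minimal sketch: diluted output concentration from an MFC calibrator.
# output = cylinder concentration * (gas flow / total flow)

def dilution_output_ppm(cylinder_ppm: float, gas_flow_sccm: float,
                        dilution_flow_sccm: float) -> float:
    """Diluted concentration delivered to the analyzer (ppm)."""
    total_flow = gas_flow_sccm + dilution_flow_sccm
    return cylinder_ppm * gas_flow_sccm / total_flow

# Example: a 50 ppm cylinder standard diluted with zero air.
print(dilution_output_ppm(50.0, 20.0, 9980.0))  # ~0.1 ppm (100 ppb)
```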
Data Acquisition Systems (DAS): DAS should have at least 32-bit logic for improved
performance (DAS with at least 16-bit logic can still be used); have modem and internet
capabilities; allow remote access and control; allow for digital input; and be able to initiate
automated calibrations and polling. It is also recommended that DAS have software compatible
with AQS and AQI reporting and editing. Both data loggers and analog chart recorders may be
used for recording data; however, the storage, communicability, and flexibility of DAS coupled
with data loggers makes the DAS systems the preferred option. More information on DAS is
found in Section 14.
Instrument Racks: Instrument racks should be constructed of steel and be able to accept sliding
trays or rails. Open racks help to keep instrument temperatures down and allow air to circulate
freely.
Instrument Benches: Instrument benches should be of sufficient space to allow adequate room
for multiple instruments with room to work and be capable of supporting a fair amount of weight
(>100 lbs). Slate or other hard, water-proof materials (e.g., steel) are recommended.
Zero Air Systems and Standards: Zero air systems should be able to deliver 10 liters/min of air that is free of ozone, NO, NO2, and SO2 to 0.001 ppm, and of CO and non-methane hydrocarbons to 0.1 ppm or below the instrument's method detection limit (whichever is lower). With NCore monitoring and the use of trace gas monitors, there may be a need to audit and calibrate at lower levels. Therefore, monitoring organizations may need to acquire zero air systems capable of delivering zero air at 20 to 30 liters/min. There are many commercially available systems; however, simple designs can be obtained by using a series of canisters. In addition, the 2012 Traceability Protocol for Gaseous Calibration Standards includes a discussion of zero gas standards which are commercially available. Although not required for use under the protocol gas program, the standards can be used as a check on zero air systems.
[5] EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards (EPA-600/R-12/531). http://www.epa.gov/nrmrl/appcd/mmd/db-traceability-protocol.html
11.1.3 Laboratory Support
While it is not required, monitoring organizations should employ full laboratory facilities. These facilities
should be equipped to test, repair, troubleshoot, and calibrate all analyzers and support equipment
necessary to operate the ambient air monitoring network. In cases where individual laboratories are not
feasible, a monitoring organization may be able to find a central laboratory (PQAO) where these activities
can be performed.
It is recommended that the laboratory be designed to accommodate the air quality laboratory/shop and PM10 and PM2.5 filter rooms [6], as well as enforcement instrumentation support activities. The air quality
portion consists of several benches flanked by instrument racks. One bench and rack are dedicated to
ozone traceability. The other instrument racks are designated for calibration and repair. A room should
be set aside to house spare parts and extra analyzers.
A manifold/sample cane should be mounted behind the bench. If possible, a sample cane that passes
through the roof to allow analyzers that are being tested to sample outside air should be mounted to the
bench. This configuration also allows any excess calibration gas to be exhausted to the atmosphere. It is
recommended that the pump room be external to the building to eliminate noise.
Each bench area should have an instrument rack attached to the bench. The instrument rack should be
equipped with sliding trays or rails that allow easy installation of instruments. If instrumentation needs to
be repaired and then calibrated, these activities can be performed on the bench top or within the rack.
Analyzers then can be allowed to warm up and be calibrated by a calibration unit. Instruments that are to
be tested are connected to the sample manifold and allowed to sample air in the same manner as if the
analyzer were being operated within a monitoring station. The analyzer is connected to an acquisition
system (e.g., DAS, data logger, chart recorder, etc.) and allowed to operate. Any intermittent problems
that occur can be observed on the data logger/chart recorder. The analyzer can be allowed to operate over
several days to see if anomalies or problems reoccur; if they do, there is a record of them. If the
instrument rack has a DAS and calibrator, nightly auto QC checks can be performed to see how the
analyzer reacts to known gas concentrations. In addition, the ozone recertification bench and rack should
be attached to a work bench. The rack should house the local ozone level 2 standard [7] and the ozone transfer standards (level 3 and greater) that are being checked for recertification. Zero air is plumbed into this rack for the calibration and testing of ozone analyzers and transfer standards.
During FRM/FEM testing, EPA tries to ensure that monitoring equipment manufacturers test instruments at varying environmental extremes. However, within the period of testing, some extremes that exist in some monitoring areas may not be achieved. Monitoring organizations that have large regions with varying extremes of temperature, humidity and pressure may want to invest in an environmental chamber that can be used to test monitoring instruments against the manufacturer's advertised performance standards.

[6] Guidance on filter room requirements can be found in methods 2.10 and 2.11 for PM10 and 2.12 for PM2.5.
[7] http://www.epa.gov/ttn/amtic/files/ambient/qaqc/OzoneTransferStandardGuidance.pdf
11.2 Preventive Maintenance
Every monitoring organization should develop a preventive maintenance program. Preventive maintenance is what its name implies: maintaining the equipment within a network to prevent downtime, costly repairs, and data loss. Preventive maintenance is an ongoing element of quality control and is
typically enveloped into the daily routine. In addition to the daily routine, scheduled activities must be
performed monthly, quarterly, semi-annually and annually. Often the standard operating procedures and/or operation manuals will provide preventive maintenance activities for the particular instrument/method. It is suggested that these sections be assembled into a preventive maintenance document that can be kept at each site and accessed electronically so that maintenance can be implemented and documented in a consistent manner.
Preventive maintenance is the responsibility of the station operators and the supervisory staff. It is
important that the supervisor review the preventive maintenance work and continually check the schedule.
The supervisor is responsible for making sure that preventive maintenance is being accomplished in a
timely manner. Preventive maintenance is not a static process; procedures must be updated for many
reasons, including, but not limited to, new models or types of instruments and new or updated methods.
The preventive maintenance schedule is changed whenever an activity is completed or performed at an
alternate time. For instance, if a multipoint calibration is performed in February instead of on the
scheduled date in March, then the subsequent six-month calibration date moves from September to
August. On a regular basis, the supervisor should review the preventive maintenance schedule with the
station operators. Following all repairs, the instruments must be verified (multi-point) or calibrated.
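The rescheduling rule above (anchor the next due date to when the activity was actually performed) can be sketched in a few lines of Python; the dates are hypothetical examples:

```python
# Minimal sketch: next due date anchored to the actual performance date,
# not the originally planned date.
from datetime import date

def next_due(performed: date, interval_months: int) -> date:
    """Add whole months to the performance date (day clamped to 28)."""
    month0 = performed.month - 1 + interval_months
    return date(performed.year + month0 // 12, month0 % 12 + 1,
                min(performed.day, 28))

# Six-month calibration planned for March but performed in February:
print(next_due(date(2013, 2, 15), 6))  # 2013-08-15: September moves to August
```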
Lists can facilitate the organization and tracking of tasks and improve the efficiency of preventive
maintenance operations. A checklist of regular maintenance activities (e.g., periodic zero-span checks,
daily routine checks, data dump/collection, calibrations, etc.) is recommended. A spare parts list,
including relevant catalog numbers, is also recommended, as it facilitates the ordering of replacement
parts. Such a list should be readily accessible and should include the types and quantities of spare parts
already on-hand.
11.2.1 Station Maintenance
Station maintenance is an element of preventive maintenance that does not occur on a routine basis;
rather, these tasks usually occur on an as needed basis. Station maintenance items are checked monthly
or whenever an agency knows that the maintenance needs to be performed. Examples of station
maintenance items include:
floor cleaning;
shelter inspection;
security inspection - fencing, locks, surveillance cameras, lighting;
visual inspection of probes and met gear;
air conditioner repair;
AC filter replacement;
weed abatement and grass cutting;
roof repair;
general cleaning;
inlet and manifold cleaning;
manifold exhaust blower lube;
desiccant replacement; and
ladder, safety rails, safety inspection, if applicable.
Simple documentation of these activities, whether in station logs or electronic logs, helps provide
evidence of continuous attention to data quality.
QA Handbook Vol II, Section 11.0
Revision No: 0
Date: 05/13
Page 6 of 7
11.2.2 Routine Operations
Routine operations are the checks that occur at specified periods of time during a monitoring station visit.
These duties must be performed and documented in order to operate a monitoring network at optimal
levels. Examples of typical routine operations are detailed in Table 11-1.
Table 11-1 Routine Operation Checks
Item | Frequency
Observation of unusual conditions/events | Each visit
Review data | Each visit
Mark charts, where applicable | Each visit
Check/oil exhaust blower | Weekly/monthly minimum
Check exterior | Weekly/monthly minimum
Check/change desiccant | Weekly/monthly minimum
Manifold leak test | Weekly/monthly minimum
Inspect tubing | Weekly/monthly minimum
Replace tubing | Annually [1]
Inspect manifold and cane | Weekly/monthly minimum
Clean manifold and cane | Every 6 months or as needed
Check HVAC systems | Weekly/monthly minimum
Check electrical connections | Weekly/monthly minimum
Field site supply inventory | Weekly/monthly minimum
Residence time calculation | If manifold and inlets altered

[1] If tubing is used externally as an inlet device, it may need to be replaced every 6 months or more frequently depending upon site-specific issues.
In addition to these items, the exterior of the building, sample cane, meteorological instruments and
tower, entry door, electrical cables, and any other items deemed necessary to check, should be inspected
for wear, corrosion, and weathering. Costly repairs can be avoided in this manner.
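The residence time calculation noted in Table 11-1 is the inlet/manifold volume divided by the total sample flow. A minimal sketch, with hypothetical line dimensions and flow (40 CFR Part 58, Appendix E limits residence time for reactive gas monitoring, e.g., 20 seconds):

```python
# Minimal sketch: residence time of a cylindrical inlet line.
import math

def residence_time_s(inlet_id_cm: float, length_cm: float,
                     total_flow_lpm: float) -> float:
    """Residence time in seconds: line volume / total analyzer draw."""
    volume_l = math.pi * (inlet_id_cm / 2) ** 2 * length_cm / 1000.0
    return volume_l / total_flow_lpm * 60.0

# Example: 0.8 cm ID line, 300 cm long, 2 L/min total analyzer draw.
print(residence_time_s(0.8, 300.0, 2.0))  # ~4.5 seconds, well under 20 s
```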
11.2.3 Instrument Logs and Site Logs
Each instrument and piece of support equipment (with the exception of the instrument racks and benches)
should have an Instrumentation Repair Log (either paper or electronic). The log should contain the repair
and calibration history of that particular instrument. Whenever multipoint verification/calibration,
instrument maintenance, repair, or relocation occurs, detailed notes are written in the instrumentation log.
The log contains the most recent multipoint verification/calibration report, a preventive maintenance
sheet, and the acceptance testing information or reference to the location of this information. If an
instrument is malfunctioning and a decision is made to relocate that instrument, the log travels with that
device. The log can be reviewed by staff for possible clues to the reasons behind the instrument
malfunction. In addition, if the instrument is shipped to the manufacturer for repairs, it is recommended
that a copy of the log be sent with the instrument. This information helps non-agency repair personnel
with troubleshooting instrument problems. Improper recording of instrument maintenance can complicate
future repair and maintenance procedures. The instrument log should be detailed enough to determine
easily and definitively which instrument was at which site(s) over any given time period. If a problem is
found with a specific instrument, the monitoring staff should be able to track the problem to the date it
initially surfaced and invalidate data even if the instrument was used at multiple sites.
A site log should be kept documenting maintenance of a specific monitoring site and the auxiliary
monitoring equipment located there. Information that could be recorded includes the activities listed in
the Station Maintenance and Routine Operations sections (Sections 11.2.1 and 11.2.2).
The site log is a chronology of the events that occur at the monitoring station. The log is an important
part of station maintenance because it contains the narrative of past problems and solutions to those
problems. Site log notes should be written in the form of a narrative, rather than shorthand notes or
bulleted lists. Examples of items that should be recorded in the site log are:
the date, time, and initials of the person(s) who have arrived at the site;
brief description of the weather (e.g., clear, breezy, sunny, raining);
brief description of the exterior of the site. Any changes that might affect the data should be recorded; for instance, if someone is parking a truck or tractor near the site, this note may explain high NOx values;
any unusual noises, vibrations, or anything out of the ordinary;
records of any station maintenance or routine operations performed;
description of the work accomplished at the site (e.g., calibrated instruments, repaired analyzer);
and
detailed information about the instruments that may be needed for repairs or troubleshooting.
It is not required that the instrument and site logs be completely independent of each other. However,
there is an advantage to having separate instrument logs. If instruments go in for repair, they may
eventually be sent to another site. Having a separate instrument log allows the log to travel with the
instrument. Keeping electronic instrument and station maintenance logs at stations and at centralized
facilities (see LIMS discussion Section 8) also has record keeping advantages, but there needs to be a way
that these records can be considered official and not be tampered with or falsified. Newer electronic
signature technologies are helping ensure that electronic records can be considered official. It is
important, however, that all of the required information for each instrument and site be properly recorded
using a method that is comprehensive and easily understood. Many monitoring organizations have
developed standard station maintenance forms that contain all the items to be checked and the frequency
of those checks. It then becomes a very simple procedure to use this form to check off and initial the
activities that were performed.
12.0 Calibrations
Calibration is defined as:

the comparison of a measurement standard, instrument, or item with a standard or instrument of higher accuracy to detect and quantify inaccuracies and to report or eliminate those inaccuracies by adjustment [1].
Prior to the implementation of any ambient air monitoring activities, the sampling and analysis equipment
must be checked to assure it is within calibration tolerances, and if it fails these tolerances, must be
appropriately calibrated. This function is most routinely carried out at the field monitoring location.
[1] American National Standard Quality Systems for Environmental Data and Technology Programs (ANSI/ASQ E4). http://www.asq.org/
Calibration of an analyzer or instrument establishes the quantitative relationship between the actual value of a standard, be it a pollutant concentration, a temperature, or a mass value (in ppm, °C, or g, etc.), and the analyzer's response (chart recorder reading, output volts, digital output, etc.). This relationship is used to convert subsequent analyzer response values to corresponding concentrations. Once an instrument's calibration relationship is established, it is checked at reasonable frequencies to verify that it remains in calibration.
Verification Versus Calibration
Since the term calibration is associated with an adjustment in either the instrument or software, these
adjustments should be minimized as much as possible. Sometimes performing frequent adjustments to
provide the most accurate data possible can be self-defeating and be the cause of additional
measurement uncertainty. For example, adjusting an instrument based upon a standard that might be
degrading or contaminated may actually cause data to be farther from the true concentration. Therefore,
quality control procedures that include measurements (e.g., 1-point QC, flow rate verifications, etc.) and
multi-point verifications are considered checks without correction and are used to ensure the
instruments are within the calibration tolerances. Usually these tolerances have been developed so that as
long as the instrument is within these tolerances, adjustments do not need to be made. However,
verifications should be implemented at reasonable frequencies to avoid invalidating significant amounts
of data.
NOTE: When the term calibration is used in the remainder of this section, it is assumed
that a multi-point verification is initially performed and the operator has concluded that
calibration (adjustment) is necessary.
NOTE: EPA does not recommend post-processing of data to correct for data failing one-point or multi-point verifications. For example, if after failure of a one-point QC check a subsequent verification and calibration found that data were biased high by 15%, the previous routine data (up until the last acceptable 1-point QC check) are not adjusted down by 15% and reported. Based upon validation criteria, the data are either reported as initially measured or invalidated.
Each analyzer should be calibrated as directed by the analyzer's operation or instruction manual and in accordance with the general guidance provided here. For reference methods for CO, NO2, SO2 and O3, detailed calibration procedures may also be found in the appropriate reference method Appendix in 40 CFR Part 50 [2] and the method guidance and technical assistance documents listed in the fact sheets in Appendix A.
Calibrations should be carried out at the field monitoring site by allowing the analyzer to sample test atmospheres containing known pollutant concentrations. In the case of PM and Pb monitors, where concentration standards are not available or are impractical, calibrations are performed on the flow, temperature and pressure devices as best as possible. At times this may need to be accomplished in laboratory settings rather than the field. The analyzer to be calibrated should be in operation for at least several hours (preferably overnight) prior to the calibration so that it is fully warmed up and its operation has stabilized. During the calibration, the analyzer should be operating in its normal sampling mode, and it should sample the test atmosphere through all filters, scrubbers, conditioners, and other components used during normal ambient sampling and through as much of the ambient air inlet system as is practicable. All operational adjustments to the analyzer should be completed prior to the calibration (see Section 12.7). Some analyzers can be operated on more than one range. For sites requiring the use of FRMs or FEMs (NAAQS sites), the appropriate ranges are identified in the Designated Reference and Equivalent Method List found on AMTIC [3]. Analyzers that will be used on more than one range or that have auto-ranging capability should be calibrated separately on each applicable range.
[2] http://www.access.gpo.gov/nara/cfr/cfr-table-search.html
[3] http://www.epa.gov/ttn/amtic/criteria.html
Calibration documentation should be maintained with each analyzer and also in a central backup file.
Documentation should be readily available for review and should include calibration data, calibration
equation(s) (and curve, if prepared), analyzer identification, calibration date, analyzer location,
calibration standards used and their traceability, identification of calibration equipment used, and the
person conducting the calibration.
Full Scale vs. Calibration Scale
Many older documents and some of the CFR reference methods refer to calibration at full
scale. The interpretation of this meant that monitoring organizations would calibrate to full
scale of one of the FRM/FEM approved operating range(s) of the instrument. For example,
ozone instruments are approved at 0-500 ppb or 0-1000 ppb. Many monitoring organizations calibrate the instrument by evenly spacing four upscale points up to around 500 ppb. In this scenario, with most sites reading less than 80 ppb, the majority of the upscale calibration points would be at levels not measured in ambient conditions. EPA suggests monitoring organizations calibrate using points that are more applicable to the concentrations found in their networks while still being protective of concentrations exceeding the NAAQS. Using this procedure, more points can be used to calibrate the instruments at these lower concentration levels and better inform monitoring organizations of instrument stability. For convenience, EPA will use the term
calibration scale to refer to the concentration range used for calibrating the monitoring
instruments. Section 10 provides more details on this process.
12.1 Calibration Standards and Reagents
Calibration standards are:
Reagents of high grade
Gaseous standards of known concentrations that are certified as EPA protocol gases
Instruments and/or standards of high sensitivity and repeatability.
12.1.1 Reagents
In some cases, reagents are prepared prior to sampling. Some of these reagents will be used to calibrate
the equipment, while others will become an integral part of the sample itself. In any case, their integrity
must be carefully maintained from preparation through analysis. If there are any doubts about the
method by which the reagents for a particular test were prepared or about the competence of the
laboratory technician preparing them, the credibility of the ambient air samples and the test results will
be diminished. It is essential that a careful record be kept listing the dates the reagents were prepared,
by whom, and their locations at all times from preparation until actual use. Prior to the test, one
individual should be given the responsibility of monitoring the handling and the use of the reagents.
Each use of the reagents should be recorded in a field or lab notebook.
Chemical reagents, solvents, and gases are available in various grades. Reagents can be categorized into the following six grades [4]:
1. Primary standard - Each lot is analyzed, and the percentage of purity is certified.
2. Analyzed reagents - These fall into two classes: (a) each lot is analyzed and the percentages of impurities are reported; and (b) conformity with specified tolerances is claimed, or the maximum percentages of impurities are listed.
3. USP and NF Grade - These are chemical reference standards where identity and strength
analysis are ensured.
4. Pure, c.p., chemically pure, highest purity - These are qualitative statements for
chemicals without numerical meaning.
5. Pure, purified, practical grades - These are usually intended as starting substances for laboratory syntheses.
6. Technical or commercial grades - These are chemicals of widely varying purity.
The reference and equivalent methods define the grades and purities needed for the reagents and gases
required in the Ambient Air Quality Monitoring Program.
All reagent containers should be properly labeled, either with the original label or, at a minimum, with the reagent name, date prepared, expiration date, strength, preparer, and storage conditions. Leftover reagents used during preparation or analysis should never be returned to bottles.
[4] Quality Assurance Principles for Analytical Laboratories, 3rd Edition, by Frederick M. Garfield, Eugene Klesta, and Jerry Hirsch. AOAC International (2000). http://www.aoac.org/
12.1.2 Gaseous Standards
In general, ambient monitoring instruments should be calibrated by allowing the instrument to sample
and analyze test atmospheres of known concentrations of the appropriate pollutant in air. The following is an excerpt from 40 CFR Part 58, Appendix A, Section 2.6.1:
Gaseous pollutant concentration standards (permeation devices or cylinders of compressed gas)
used to obtain test concentrations for carbon monoxide (CO), sulfur dioxide (SO2), nitrogen oxide
(NO), and nitrogen dioxide (NO2) must be traceable to either a National Institute of Standards
and Technology (NIST) Traceable Reference Material (NTRM) or a NIST-certified Gas
Manufacturers Internal Standard (GMIS), certified in accordance with one of the procedures
given in reference 4 of this appendix. Vendors advertising certification with the procedures
provided in reference 4 of this appendix and distributing gases as EPA Protocol Gas must
participate in the EPA Protocol Gas Verification Program or not use EPA in any form of
advertising.
"Traceable" is defined in 40 CFR Parts 50 and 58 as meaning that a local standard has been compared
and certified, either directly or via not more than one intermediate standard, to a primary standard such
as a National Institute of Standards and Technology Standard Reference Material (NIST SRM) or a
USEPA/NIST-approved Certified Reference Material (CRM). Normally, the working standard should
be certified directly to the SRM or CRM, with an intermediate standard used only when necessary.
Direct use of a CRM as a working standard is acceptable, but direct use of an NIST SRM as a working
standard is discouraged because of the limited supply and expense of SRMs. At a minimum, the
certification procedure for a working standard should:
establish the concentration of the working standard relative to the primary standard;
certify that the primary standard (and hence the working standard) is traceable to a NIST primary
standard;
include a test of the stability of the working standard over several days; and
specify a recertification interval for the working standard.
Certification of the working standard may be established by either the supplier or the user of the standard. As described in CFR, gas suppliers advertising EPA Protocol Gas are required to participate in the EPA Protocol Gas Verification Program. Information on this program, including the gas suppliers participating in the program, can be found on AMTIC [5]. EPA has developed procedures for the establishment of protocol gases in the document EPA Traceability Protocol for Assay and Certification of Gaseous Calibration Standards [6]. Table 2-3 in the Traceability Document provides the maximum certification periods for verification and calibration standards used in the ambient air program. Since these periods sometimes change, the table is not presented here.
Test concentrations of ozone must be traceable to a primary standard UV photometer (see discussion of primary standards below) as described in 40 CFR Part 50, Appendix D and the guidance document Transfer Standards for the Calibration of Ambient Air Monitoring Analyzers for Ozone [7].
[5] http://www.epa.gov/ttn/amtic/
[6] http://www.epa.gov/nrmrl/appcd/mmd/db-traceability-protocol.html
[7] http://www.epa.gov/ttn/amtic/files/ambient/qaqc/OzoneTransferStandardGuidance.pdf
For ambient air monitoring activities, zero concentrations can be acquired through zero air generation devices or purchased as standards. Although zero concentrations are not required to be traceable to a
primary standard, care should be exercised to ensure that zero device or standards used are adequately
free of all substances likely to cause a detectable response from the analyzer and at a minimum, below the
lower detectable limit of the criteria pollutants being measured. Periodically, several different and
independent sources of zero should be compared. The one that yields the lowest response can usually
(but not always) be assumed to be the best zero device/standard. If several independent zero
device/standards produce exactly the same response, it is likely that all the standards are adequate.
Certification periods decrease for concentrations below the applicable concentration ranges provided in Table 2-3 of the Traceability Protocol. For example, the certification period for SO2 standards between 13-40 ppm is 6 months. Also, tank size may affect stability of low-level standards. Some gas manufacturers claim that standards supplied in smaller tanks are stable for longer periods of time than the same concentration in larger tanks. Although this claim has not been verified, if true it may be helpful in making purchasing decisions.
Primary Reference Standards
A primary reference standard can be a defined measurement standard designated for the calibration of other measurement standards for quantities of a given kind in a given organization [8]. NIST's standard reference materials (SRMs) are examples of primary reference standards. NIST also describes a primary reference standard as a standard that is designated or widely acknowledged as having the highest metrological qualities and whose value is accepted without reference to other standards of the same quality. For example, the NIST-F1 Atomic Clock [9] is recognized as a primary standard for time and frequency. A true primary standard like NIST-F1 establishes maximum levels for the frequency shifts caused by environmental factors. By summing or combining the effects of these frequency shifts, it is possible to estimate the uncertainty of a primary standard without comparing it to other standards. NIST maintains a catalog of SRMs that can be accessed through the Internet [10]. Primary reference standards are usually quite expensive and are often used to calibrate, develop, or assay working or secondary standards. In order to establish and maintain NIST traceability, the policies posted at the NIST Website [11] should be observed.
It is important that primary reference standards are maintained, stored, and handled in a manner that
maintains their integrity. These standards should be kept under secure conditions and records should be
maintained that document chain of custody information.
12.1.3 Instruments
The accuracy of various measurement devices in sampling and continuous instruments is very important to data quality. For example, in order to produce the correct flow rate to establish an accurate PM2.5 cut point, the temperature and barometric pressure sensors, as well as the flow rate device, must be producing accurate measurements. Table 12-1 provides some of the more prevalent instruments that need to be calibrated at a minimum annually, or when shown through various verification checks to be out of acceptable tolerances. In addition, the audit standards used to implement the checks and calibrations should be certified annually in order to establish their accuracy and traceability to higher standards.
[8] Definition of reference measurement standard from the International Vocabulary of Metrology - Basic and General Concepts and Associated Terms (VIM). http://www.bipm.org/en/publications/guides/vim.html
[9] http://www.nist.gov/pml/div688/grp50/primary-frequency-standards.cfm
[10] http://www.nist.gov
[11] http://ts.nist.gov/traceability/
Table 12-1 Instruments and Devices Requiring Calibration and Certifications

Criteria | Acceptable Range | 40 CFR Reference

Verification/calibration of devices in sampler/analyzer/laboratory against an authoritative standard:
Barometric pressure | ±10 mm Hg | Part 50, App. L, Sec. 9.3
Temperature | ±2 °C of standard | Part 50, App. L, Sec. 9.3
Flow rate | ±2% of transfer standard | Part 50, App. L, Sec. 9.2
Design flow rate adjustment | ±2% of design flow rate | Part 50, App. L, Sec. 9.2.6
Clock/timer verification | 1 min/mo | Part 50, App. L, Sec. 7.4
Microbalance calibration | Readability 1 µg; repeatability 1 µg | Part 50, App. L, Sec. 8.1

Verification/calibration of devices in shelter or lab against an authoritative standard:
Lab temperature | ±2 °C | not described
Lab humidity | ±2% | not described
Microbalance calibration | Readability 1 µg; repeatability 1 µg | Part 50, App. L, Sec. 8.1

Verification/calibration standards requiring certification annually:
Standard Reference Photometer (SRP) | ±4% or ±4 ppb (whichever is greater); RSD of six slopes ≤3.7% | not described
SRP recertification to local primary standard | Std. dev. of 6 intercepts ≤1.5; new slope within ±0.05% of previous | not described
Flow rate | ±2% of NIST-traceable standard | Part 50, App. L, Sec. 9.2
Pressure | 1 mm Hg resolution, ±1 mm Hg accuracy | not described
Temperature | ±0.1 °C accuracy | not described
Gravimetric standards | 0.025 mg | not described
12.2 Multi-point Verifications/Calibrations
Multi-point calibrations consist of a zero and 4 upscale points, the highest being a concentration above the
NAAQS (for SLAMS criteria pollutants) and higher than any routine values one might expect at the site. This is defined as the calibration scale for the instrument and is different from what has been traditionally defined as the full-scale operating range in the FRM/FEM approval documentation. Multi-
point calibrations are used to establish or verify the linearity of analyzers upon initial installation, after
major repairs and at specified frequencies. Most modern analyzers have a linear or very nearly linear
response with concentration. If a non-linear analyzer is being calibrated, additional calibration points
should be included to adequately define the calibration relationship, which should be a smooth curve.
Calibration points should be plotted or evaluated statistically as they are obtained so that any deviant
points can be investigated or repeated immediately.
Most analyzers have zero and span adjustment controls, which should be adjusted based on the zero and
highest test concentrations, respectively, to provide the desired scale range within the analyzer's
specifications (see section 12.5). For analyzers in routine operation, unadjusted (''as is") analyzer zero
and span response readings should be obtained prior to making any zero or span adjustments.
NO/NO2/NOx analyzers may not have individual zero and span controls for each channel; the analyzer's
operation/instruction manual should be consulted for the proper zero and span adjustment procedure.
Zero and span controls often interact with each other, so the adjustments may have to be repeated several
times to obtain the desired final adjustments.
After the zero and span adjustments have been completed and the analyzer has been allowed to stabilize
on the new zero and span settings, all calibration test concentrations should be introduced into the
analyzer for the final calibration. The final, post-adjusted analyzer response readings should be obtained
from the same device (chart recorder, data acquisition system, etc.) that will be used for subsequent
ambient measurements. The analyzer readings are plotted against the respective test concentrations, and
the best linear (or nonlinear if appropriate) curve to fit the points is determined. Ideally, least squares regression analysis (with an appropriate transformation of the data for non-linear analyzers) should be used to determine the slope and intercept for the best-fit calibration line of the form y = mx + b, where y represents the analyzer response, x represents the pollutant concentration, m is the slope, and b is the y-intercept of the best-fit calibration line. When this calibration relationship is subsequently used to compute concentration measurements (x) from analyzer response readings (y), the formula is transposed to the form x = (y - b)/m.
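A minimal sketch of this least-squares fit and the transposed conversion, including a per-point comparison against the best-fit line of the kind discussed next; the test concentrations and responses are hypothetical examples:

```python
# Minimal sketch: least-squares calibration fit y = m*x + b, the
# transposed conversion x = (y - b)/m, and per-point percent differences
# from the best-fit line.
import numpy as np

conc = np.array([0.0, 30.0, 60.0, 90.0, 120.0])  # test concentrations x (ppb)
resp = np.array([0.4, 30.6, 59.7, 90.5, 119.8])  # analyzer responses y

m, b = np.polyfit(conc, resp, 1)  # slope and intercept of best-fit line

def to_concentration(y: float) -> float:
    """Convert a subsequent analyzer response to concentration."""
    return (y - b) / m

predicted = m * conc + b
for x, y, yhat in zip(conc, resp, predicted):
    if yhat > 0:  # percent difference is undefined at the zero point
        print(f"{x:5.0f} ppb point: {(y - yhat) / yhat * 100:+.2f}% from line")
print(f"slope = {m:.4f}, intercept = {b:.4f}")
print(f"response 75.2 -> {to_concentration(75.2):.1f} ppb")
```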
For the gaseous pollutants, the verification/calibration is considered acceptable if all calibration points fall within 2% (or an absolute difference) of the best-fit straight line over the calibration scale. Which acceptance criterion (percent or absolute difference) is used depends on the concentration of the calibration points. EPA has developed a Data Assessment Statistical Calculator (DASC) [12] tool that automates this process for data evaluation (see Figure 12.1). For manual samplers, the flow rate, temperature, and pressure devices are checked at different settings. Acceptance criteria for these devices can be found in the MQO Tables in Appendix D.

[12] http://www.epa.gov/ttn/amtic/qareport.html
As a quality control check on calibrations, the standard error or correlation coefficient can be calculated
along with the regression calculations. A control chart of the standard error or correlation coefficient
could then be maintained to monitor the degree of scatter in the calibration points and, if desired, limits of
acceptability can be established.
12.3 Frequency of Calibration and Analyzer Adjustment
An analyzer should be calibrated (or recalibrated):
upon initial installation,
following physical relocation,
after any repairs or service that might affect its calibration,
following an interruption in operation of more than a few days,
upon any indication of analyzer malfunction or change in calibration, and
at some routine interval (multi-point verification, see below).
When calibration relationships are applied to analyzer responses to determine actual concentrations, it is
suggested that the analyzer undergo multi-point verification/calibration periodically to maintain close
agreement. The frequency of this routine periodic recalibration is a matter of judgment and is a tradeoff
among several considerations, including: the inherent stability of the analyzer under the prevailing
conditions of temperature, pressure, line voltage, etc., at the monitoring site; the cost and inconvenience
of carrying out the calibrations; the quality of the ambient measurements needed; the number of ambient
measurements lost during the calibrations; and the risk of collecting invalid data because of a malfunction
or response problem with the analyzer that wouldn't be discovered until a calibration is carried out.
When a new monitoring instrument is first installed, zero/span and one point QC checks should be very
frequent, perhaps daily or 3 times per week, because little or no information is available on the drift
performance of the analyzer. With the advancement in data acquisition system technology, many
monitoring organizations are running these QC checks daily. However, at a minimum, the QC checks are required to be implemented every two weeks.
however, individual units of the same model may perform quite differently. After enough information on
the drift performance of the analyzer has been accumulated, the calibration frequency can be adjusted to
provide a suitable compromise among the various considerations mentioned above.
To facilitate the process of determining calibration frequency, it is strongly recommended that control
charts be used to monitor the zero/span and one-point QC drift performance of each analyzer. Control
charts can be constructed in different ways, but the important points are to visually represent and
statistically monitor drift, and to be alerted if the drift becomes excessive so that corrective action can be
taken. Such control charts make important use of the unadjusted zero and span response readings.
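As a hedged illustration, a simple rule might compare each new unadjusted zero or span drift reading against limits derived from recent history; the 2-sigma warning and 3-sigma action limits below are conventional control-chart placeholders, not the Appendix D criteria:

    import statistics

    def check_drift(history, new_value, warn_sigma=2.0, action_sigma=3.0):
        """Classify a new drift reading against past readings (needs >= 2)."""
        mean = statistics.mean(history)
        sd = statistics.stdev(history)
        if abs(new_value - mean) > action_sigma * sd:
            return "action"    # excessive drift: corrective action needed
        if abs(new_value - mean) > warn_sigma * sd:
            return "warning"   # drift trending out: watch closely
        return "ok"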
12.4 Adjustments to Analyzers
Ideally, all ambient measurements obtained from an analyzer should be calculated on the basis of the
most current multipoint calibration or on the basis of both the previous and subsequent calibrations (see
Section 12.5). Some acceptable level of drift (i.e., deviation from an original or nominal response curve)
can be allowed before physical adjustments (i.e., calibration) must be made because the calibration curve
used to calculate the ambient measurements is kept in close agreement with the actual analyzer response.
The chief limitations are the amount of change in the effective scale range of the analyzer that can be
tolerated and possible loss of linearity in the analyzer's response due to excessive deviation from the
design range. Cumulative drifts of up to 15 percent of full scale from the original or nominal zero and
span values may not be unreasonable, subject to the limitations mentioned above.
Due to advancements in monitoring technologies, ambient air monitors are much more stable and
adjustments are not as necessary. Earlier versions of this Handbook included sections for zero/span
calibrations as well as physical zero/span adjustments. Precise adjustment of the zero and span controls
may not be possible because of: (1) limited resolution of the controls, (2) interaction between the zero
and span controls, and (3) possible delayed reaction to adjustment or a substantial stabilization period
after adjustments are made. Precise adjustments may not be necessary because calibration of the analyzer
following zero and span adjustments will define the precise response characteristic (calibration curve).
EPA feels that frequent adjustments of instruments should not be necessary and may in fact lead to more
data quality uncertainty. EPA does not recommend span adjustments be made between multi-point
calibrations but zero adjustments are appropriate.
EPA is no longer including guidance suggesting that the calibration equation be updated after each
zero/span check and suggests the ambient readings be calculated from the most recent multipoint
calibration curve or from a fixed nominal or "universal" calibration curve (Section 12.5). In this case, the
zero and span checks serve only to measure or monitor the deviation (drift error) between the actual
analyzer response curve and the calibration curve used to calculate the ambient measurements.
Automatic Self-Adjusting Analyzers
Some air monitoring analyzers are capable of periodically carrying out automatic zero and span
calibrations and making their own zero and span self adjustments to predetermined readings. EPA
discourages the use of automatic span adjustments but considers automatic zero adjustments reasonable
when: 1) the automatic zero standards pass through the sample probe inlet and sample conditioning
system, 2) the zero test is performed every day, and 3) both the adjusted and unadjusted zero response
readings can be obtained from the data recording device. EPA does not suggest zero adjustments on
checks that occur every two weeks. However, an adjustment does not mean a post-processing correction
on zero (adjusting the previous 24 hours of routine data based on the difference between the current zero
reading and the previous 24-hour reading); the automated zero does not correct routine data. The
zero is reset every 24 hours. EPA does not recommend making automatic or manual adjustments
(corrections) to the span. It is expected that the difference between the unadjusted and adjusted zero
response is negligible and not greater than the zero drift acceptance criteria listed in the validation
template (App D). Data invalidation and corrective action should occur if the difference between the
24-hour unadjusted and adjusted zero drift is greater than the validation template acceptance criteria. Data
loggers should be programmed to provide flags or warnings of this occurrence.
12.5 Data Reduction Using Calibration Information
As noted previously, an analyzer's response calibration curve relates the analyzer response to actual
concentration units of measure, and the response of most analyzers tends to change (drift)
unpredictably with passing time. These two conditions must be addressed in the mechanism that is used
to process the raw analyzer readings into final concentration measurements. Two practical methods are
described below, listed in order of preference:
1) "Universal" Calibration--A fixed, "universal" calibration is established for the analyzer and used to
calculate all ambient readings. All verifications and checks are used to measure the deviation of the
current analyzer response from the universal calibration. Whenever this deviation exceeds the established
zero and span adjustment limits, the analyzer is recalibrated.
2) Major Calibration Update--In this method, the calibration slope and intercept used to calculate
ambient measurements are updated only for "major" calibration (i.e., semi-annual or annual multi-point
verification/calibrations). All ambient measurements are calculated from the most recent major
calibration. Between major calibrations, periodic zero and span verifications are used to measure the
difference between the most recent major calibration and the current instrument response. Physical or
automated adjustments of the zero may be appropriate; however, span adjustment to restore a match
between the current analyzer response and the most recent major calibration is not suggested. Whenever
this deviation exceeds the established zero and span adjustment limits, the analyzer is recalibrated.
12.6 Validation of Ambient Data Based on Calibration Information
When zero or span drift validation limits (see Figure 12.1) are exceeded, ambient measurements should
be invalidated back to the most recent acceptable zero/span/one-point QC check where such
measurements are known to be valid. Also, data following an analyzer malfunction or period of non-
operation should be regarded as invalid until the next subsequent calibration unless unadjusted zero and
span readings at that calibration can support its validity.
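A minimal sketch of this back-invalidation rule, assuming timestamped records and the times of the last acceptable and the failed check (all names are hypothetical):

    def invalidate_back(records, last_good_check, failed_check):
        """records: list of (timestamp, value) pairs.
        Flag as invalid everything collected after the last acceptable
        zero/span/one-point QC check, up to and including the failed one."""
        return [(ts, value,
                 "invalid" if last_good_check < ts <= failed_check else "valid")
                for ts, value in records]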
Documentation
All data and calculations involved in these calibration activities should be recorded in the instrument log
book described in Section 11.
13.0 Inspection/Acceptance for Supplies and Consumables
Both field operations and laboratory operations need supplies and consumables. The focus of this section
is the management of laboratory and field sampling supplies and consumables. For information on the
actual field/lab supplies and consumables needed for any specific method, see the reference method in 40
CFR Part 50 [1], the general guidance methods and technical assistance documents on AMTIC [2], and the
manufacturers' operations manuals. From this information, monitoring organizations, as part of the
QAPP requirements, will develop specific SOPs for their monitoring and analytical methods. One section of
the SOPs requires a listing of the acceptable supplies and consumables for the method.
Pollutant parameters are measured using electronic (e.g., continuous emission monitors, FTIRs),
wet chemical, or physical methods. Chemical analysis always involves the use of consumable
supplies that must be replaced on a schedule consistent with their stability and with the rate at which
samples are taken. Frequently used chemical methods require adequate supplies of chemicals for
operation (e.g., a three-month stock) so that the supplier can comply with delivery schedules and there is no
downtime waiting for supplies. In some cases, analytical reagents for specific air contaminants
deteriorate rapidly and need protective storage. The following information may be helpful when
considering the use of these consumable items. Much of the information presented below is derived from
the document Quality Assurance Principles for Analytical Laboratories [3].
13.1 Supplies Management
Control of supplies and consumables is important to the success of the quality assurance program. It is
important that specifications for each item are prepared and adhered to during the procurement process.
When specifications are prepared, the following points should be considered: identity, purity, potency,
source, tests to be conducted for quality and purity, need for further purification, storage and handling
procedures, and replacement dates. As part of supplies management, the following actions are
recommended:
establish criteria and specifications for the important supplies and consumables.
check and test the supplies and consumables against specifications, before placing them in use.
design and maintain a supplies management program to ensure the quality of reagents used in
day-to-day operations, paying particular attention to primary reference standards, working
standards, and standard solutions.
decide on the kinds of purified water that are necessary, and develop suitable tests and testing
intervals to ensure the quality of water used in analytical work and for cleaning glassware.
purchase only Class A volumetric glassware and perform calibrations and recalibrations that are
necessary to achieve reliable results.
establish procedures for cleaning and storing glassware/sample containers with due consideration
for the need for special treatment of glassware/sample containers used in trace analysis.
establish a useful life for glassware/sample containers and track this.
discard chipped and etched glassware or damaged containers.
[1] http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title40/40tab_02.tpl
[2] http://www.epa.gov/ttn/amtic/
[3] Quality Assurance Principles for Analytical Laboratories, 3rd Edition, by Frederick M. Garfield, Eugene Klesta,
and Jerry Hirsch. AOAC International (2000). http://www.aoac.org/
13.2 Standards and Reagents
Discussions on gaseous standards and reagents are found in Section 12. What is most important is that
the standards and reagents used are of appropriate purity and certified within the acceptable limits of the
program for which they are used. Table 12-1 provides certification frequencies for gaseous standards, but
within these timeframes, and as new cylinders are purchased, monitoring organizations need to develop a
standard checking scheme to establish ongoing acceptance of standards. For example, a new SRM should
be purchased months prior to the expiration (or need for recertification) or complete use of an older
standard, in order to develop an overlapping cylinder acceptance process that establishes
traceability and consistency in monitoring. If, for instance, a new SRM is put into use in a monitoring
organization and all monitoring instruments traced to the cylinder start failing calibration, it may mean
that either the new or the older cylinder was not properly certified or has integrity problems. By checking
both cylinders prior to placing the new cylinder in use, this issue can be avoided.
13.2.1 Standard Solutions
Most laboratories maintain a stock of standard solutions. The following information on these solutions
should be kept in a log book:
identity of solution
strength
method of preparation (reference to SOP)
standardization calculations
recheck of solution for initial strength
date made/expiration date
initials of the analyst
storage
As mentioned above, all standard solutions should contain appropriate labeling as to contents and
expiration dates.
13.2.2 Purified Water
Water is one of the most critical but most often forgotten reagents. The water purification process should
be documented from the quality of the starting raw water to the systems used to purify the water,
including how the water is delivered, the containers in which it is stored, and the tests and the frequency
used to ensure the quality of the water.
13.3 Volumetric Glassware
Use of the appropriate glassware is important since many preparations and analyses require the
development of reagents, standards, dilutions, and controlled delivery systems. It is suggested that
Class A glassware be used in all operations requiring precise volumes. SOPs requiring volumetric
glassware should specify the size/type required for each specific operation.
13.4 Sample Containers
Samples may be contaminated by using containers that have not been properly cleaned and prepared (e.g.,
VOC canisters, particulate filter cassettes/containers) or purchased from vendors without proper
inspection prior to use. In addition, all sample containers have a useful life. Some containers, such as
the low-volume PM sample filter cassettes, can be damaged over time and cause leaks in the sampling
system. It is important to track the inventory of sampling containers, recording:
date of purchase;
first use;
frequency of use (estimate);
time of retirement.
An inventory of this type can help ensure new containers are purchased before the expiration date of older
containers. Use of appropriate sample containers is important since the material of the container could
potentially affect the collected sample. Always refer to the specific method to see if a particular type of
container (e.g., high density polyethylene [HDPE] bottles, amber glass) is required for the storage of the
sample.
13.5 Particulate Sampling Filters
Filters are used for the manual methods for criteria pollutants (e.g., PM10, PM2.5, PM10-2.5, total PM, Pb).
No commercially available filter is ideal in all respects. The sampling program should determine
the relative importance of certain filter evaluation criteria (e.g., physical and chemical characteristics,
ease of handling, cost). The reference methods provide detailed acceptance criteria for filters. Some of
the basic criteria that must be met regardless of the filter type follow:
Visual inspection - for pinholes, tears, creases, or other flaws that may affect the collection
efficiency of the filter, which may be consistent through a batch. This visual inspection would
also be made prior to filter installation and during laboratory pre- and post-weighing to assure the
integrity of the filter is maintained throughout the data collection process.
Collection efficiency - greater than 99% as measured by DOP test (ASTM 2988) with
0.3 micrometer particles at the sampler's operating face velocity.
Integrity - (pollutant specific) measured as the concentration equivalent corresponding to the
difference between the initial and final weights of the filter when weighed and handled under
simulated sampling conditions (equilibration, initial weighing, placement on inoperative sampler,
removal from a sampler, re-equilibration, and final weighing).
Alkalinity - less than 25 microequivalents/gram of filter following at least two months of
storage at ambient temperature and relative humidity.
Note: Some filters may not be suitable for use with all samplers. Due to filter handling characteristics or
rapid increases in flow resistance due to episodic loading, some filters, although they meet the above
criteria, may not be compatible with the model of sampler chosen. It would be prudent to evaluate more
than one filter type before purchasing large quantities for network use. In some cases, EPA Headquarters
may have national contracts for acceptable filters that will be supplied to monitoring organizations.
13.6 Field Supplies
Field instrumentation, which includes samplers and analyzers, requires supplies for the actual collection
process as well as for quality control activities and crucial operational maintenance. These supplies can
include, but are not limited to:
Gas standards/Permeation standards
HVAC units
Maintenance equipment (tools, ladders)
Safety supplies (first aid kit)
Information technology supplies (PC, printers, paper, ink, diskettes)
Sample line filters
Charcoal
Desiccant
Gaskets and O-rings
Sample lines and manifolds
Disposable gloves
Water/distilled water
Pumps and motors
Chart paper and ink
Impaction oil
TEOM FDMS filter
The site logbook discussed in Section 11 should include a list and inventory of these critical field
supplies. As part of routine maintenance activities, this inventory can be reviewed to determine if any
supplies are in need of restocking. If electronic logbooks are used, information from each site can be
aggregated at the field office to better assess needs and develop efficient ordering processes.
14.0 Data Acquisition and Information Management
Achieving air monitoring objectives depends, in part,
on collecting data that are:
reliable;
of known quality;
easily accessible to a variety of users; and
aggregated in a manner consistent with its
primary use
In order to accomplish this, information must be collected and managed in a manner that protects and
ensures its integrity. Data management is the development, execution and supervision of plans,
policies, programs and practices that control, protect, deliver and enhance the value of data and
information assets [1].
Most of the data reported by the monitoring organization will be collected through automated systems.
These systems must be effectively managed according to a set of guidelines and principles designed to
ensure data integrity. The EPA document Good Automated Laboratory Practices (GALP) [2] defines six
data management principles that are worth reviewing:
1. Laboratory management must provide a method of assuring the integrity of all Laboratory
information management systems (LIMS) data. Communication, transfer, manipulation, and
the storage/recall process all offer potential for data corruption. The demonstration of control
necessitates the collection of evidence to prove that the system provides reasonable protection
against data corruption.
2. The formulas and decision algorithms employed by the LIMS must be accurate and
appropriate. Users cannot assume that the test or decision criteria are correct; those formulas
must be inspected and verified.
3. A critical control element is the capability to track LIMS Raw Data entry, modification, and
recording to the responsible person. This capability utilizes a password system or equivalent to
identify the time, date, and person or persons entering, modifying, or recording data.
4. Consistent and appropriate change controls, capable of tracking the LIMS operations and
software, are a vital element in the control process. All changes must follow carefully planned
procedures, be properly documented, and when appropriate include acceptance testing.
5. Procedures must be established and documented for all users to follow. Control of even the
most carefully designed and implemented LIMS will be thwarted if the user does not follow
these procedures. This principle implies the development of clear directions and SOPs, the
training of all users, and the availability of appropriate user support documentation.
6. The risk of LIMS failure requires that procedures be established and documented to
minimize and manage their occurrence. Where appropriate, redundant systems must be
[1] DAMA-DMBOK Guide (Data Management Body of Knowledge), Introduction & Project Status.
http://www.dama.org/files/public/DI_DAMA_DMBOK_Guide_Presentation_2007.pdf
[2] http://www.epa.gov/irmpoli8/archived/irm_galp/
installed and periodic system backups must be performed at a frequency consistent with the
consequences of the loss of information resulting from a failure. The principle of control must
extend to planning for reasonable unusual events and system stresses.
Although the GALP is written for LIMS, the principles listed above are applicable to ambient air
monitoring information management systems in the field and at the central office. This section provides
guidance in these areas, including identification of advanced equipment and procedures that are
recommended for implementation. The recommended procedures rely on digital communication by the
data acquisition system to collect a wider variety of information from the analyzers/samplers, to control
instrument calibrations, and to allow for more routine, automated, and thorough data quality efforts. The
section will discuss:
1. Data acquisition - collecting the raw data from the monitor/sampler, storing it for an appropriate
interval, aggregating or reducing the data, and transferring the data to final storage in a local
database (the monitoring organization's database).
2. Data transfer - preparing and moving data to external databases such as AIRNow or the Air
Quality System (AQS).
3. Data management - the development, execution and supervision of plans, policies,
programs and practices that control, protect, deliver and enhance the value of data and
information assets [3].
In response to guidelines issued by the Office of Management and Budget (OMB) [4], EPA developed the
document titled Guidelines for Ensuring and Maximizing the Quality, Objectivity, Utility, and Integrity of
Information Disseminated by the Environmental Protection Agency [5]. The Guideline contains EPA's
policy and procedural guidance for ensuring and maximizing the quality of information it disseminates.
The Guideline also incorporates the following performance goals:
The Guideline also incorporates the following performance goals:
Disseminated information should adhere to a basic standard of quality, including objectivity,
utility, and integrity.
The principles of information quality should be integrated into each step of EPA's development
of information, including creation, collection, maintenance, and dissemination.
Administrative mechanisms for correction should be flexible, appropriate to the nature and
timeliness of the disseminated information, and incorporated into EPA's information resources
management and administrative practices.
EPA suggests that monitoring organizations review this document since it is relevant to the ambient air
information it generates and can help to ensure that data can withstand challenges to its quality.
[3] http://www.dama.org/i4a/pages/index.cfm?pageid=1
[4] Section 515(a) of the Treasury and General Government Appropriations Act for Fiscal Year 2001 (Public Law
106-554; H.R. 5658).
[5] http://www.epa.gov/quality/informationguidelines/documents/EPA_InfoQualityGuidelines.pdf
14.1 Data Acquisition Systems
Continuing advances in computer technology used in monitoring instruments and data loggers are:
increasing the volume of the pollutant data stream by enabling the capture of more finely
time-resolved data,
providing operational data about instruments that supports data validation and helps to reduce
data loss by identifying problems early, and
making more data available to users, sooner.
In order to take full advantage of these capabilities, data management software systems will need to
support efficient processing and validation of data and provide communication of that data in a format
and a timeframe that serves the needs of multiple users. An example of a benefit from using these
systems is the forecasting of pollution episodes with near real-time data captured from NCore and ozone
monitoring networks.
This section provides information on Data Acquisition Systems (DAS), a term used for systems that
collect, store, summarize, report, print, calculate or transfer data. The transfer is usually from an analog
or digital format to a digital medium. This section will also discuss limitations of data collected with
DAS.
14.1.1 Automated Data Acquisition Systems
DAS have been available to air quality professionals since the early 1980s. The first systems were single
and multi-channel systems that collected data on magnetic media, which were usually hand
transferred to a central location or laboratory for downloading to a central computer. With the advent of
digital data transfer from the stations to a central location, the need to hand transfer data has diminished.
14.1.2 Instrument to Data logger
Figure 14.1 shows the basic transfer of data from the instrument to the final product: a hard copy report
or a data transfer to a central computer. Most continuous monitors have the ability to output data in at
least two ways: an analog output and an RS232 digital port. Some instrumentation may now include
USB, Ethernet, and FireWire capability. The instrument usually outputs a DC voltage that varies directly
with the concentration collected; most instruments' output is a DC voltage in the 0-1 or 0-5 volt range.

[Figure 14.1 DAS data flow: ambient instrument -> analog or digital signal -> data logger (multiplexer ->
analog/digital converter -> CPU -> RAM memory -> storage medium) -> on-site printer (hard copy report)
or on-site computer, and via modem -> local central computer]

The following provides a brief summary of the analog (A) or digital (D) steps:
(A) the voltage is measured by the multiplexer, which allows voltages from many
instruments to be read at the same time.
(A) the multiplexer sends a signal to the A/D converter, which changes the analog voltage
to a low amperage digital signal.
(A) the A/D converter sends signals to the central processing unit (CPU), which directs the digital
electronic signals to a display or to the random access memory (RAM), which stores the short-term
data until the end of a pre-defined time period.
(A/D) the CPU then shunts the data from the RAM to the storage medium, which can be magnetic
tape, a computer hard drive, or a computer diskette.
(A/D) the computer storage medium can be accessed remotely or at the monitoring location.
The data transfer may occur via modem to a central computer storage area or be printed out as hard copy. In
some instances, the data may be transferred from one storage medium to another (e.g., hard drive to a
diskette, tape, or CD). The use of a data logging device to automate data handling from a
continuous sensor is not a strict guarantee against recording errors; internal validity checks are necessary
to avoid serious data recording errors. This can be accomplished by polling a period of data directly from
the monitor and comparing that data to the data stored in the local central computer.
Analog Versus Digital DAS - Most analyzers built within the last 15 years have the capability (RS232
ports) to transfer digital signals, yet many monitoring organizations currently perform data acquisition of
automated monitors by recording an analog output from each gas analyzer using an electronic data
logger. As explained above, the analog readings are converted and stored in digital memory in the data
logger for subsequent automatic retrieval by a remote data management system. This approach can
reliably capture the monitoring data, but does not allow complete control of monitoring operations, and
the recorded analog signals are subject to noise that limits the detection of low concentrations.
Furthermore, with the analog data acquisition approach, the data review process is typically
labor-intensive and not highly automated. For these reasons, EPA encourages the adoption of digital data
acquisition methods. In that regard, the common analog data acquisition approach often does not fully
utilize the capabilities of the electronic data logger. Many data loggers have the capability to acquire data
in digital form and to control some aspects of calibrations and analyzer operation, but these capabilities
are not utilized in typical analog data acquisition approaches. Digital data acquisition reduces noise in the
recording of gas monitoring data, thereby improving sensitivity. It also records and controls the
instrument settings, internal diagnostics, and programmed activities of monitoring and calibration
equipment. Such data acquisition systems also typically provide automated data quality assessment as
part of the data acquisition process.
[Figure 14.2 Flow of data from gas analyzers to final reporting: CO, SO2, NOy, PM (continuous or FRM),
and MET instruments, a calibrator, a zero air supply, and manifold/external valves connect to a station
desktop system through RS-232 connections (multi-drop within manufacturer) and digital I/O, carrying
status, relay control, measurements, and diagnostics; data are pushed or pulled from multiple stations over
dial-up/DSL, cable, satellite, or wireless to a database server for S/L/T validation (optional), public
reporting, and AQS.]
It may be cost-effective for monitoring organizations to adopt digital data acquisition and calibration
control simply by more fully exploiting the capabilities of their existing electronic data loggers. For
example, many gas analyzers are capable of being calibrated under remote control. The opportunity to
reduce travel and personnel costs through automated calibrations is a strong motivator for monitoring
organizations to make greater use of the capabilities of their existing data acquisition systems. The
NCore multi-pollutant sites are taking advantage of the newer DAS technologies. Details of these
systems can be found in the technical assistance document for this program [6].
Figure 14.2 illustrates the recommended digital data acquisition approach for the NCore sites. It presents
the data flow from the gas monitors, through a local digital data acquisition system, to final reporting of
the data in various public databases. This schematic shows several of the key capabilities of the
recommended approach. A basic capability is the acquisition of digital data from multiple analyzers and
other devices, thereby reducing noise and minimizing the effort needed in data processing. Another
capability is two-way communication, so that the data acquisition system can interrogate and/or control
the local analyzers, calibration systems, and even sample inlet systems, as well as receive data from the
analyzers. Data transfer to a central location is also illustrated, with several possible means of that
transfer shown. Monitoring organizations are urged to take advantage of the latest technology in this part
of the data acquisition process, as even technologies such as satellite data communication are now well
established, commercially available, and inexpensive to implement for monitoring operations.
Depending on the monitoring objective, it may be important that data are reported in formats of
immediate use in public databases such as AQS [7] and the multi-monitoring organization AIRNow [8]
sites.
An advantage of DAS software is the ability to facilitate the assembly, formatting and reporting of
monitoring data to these databases.
Digital data acquisition systems such as those in Figure 14.2 offer a great advantage over analog systems
in the tracking of calibration data, because of the ability to control and record the internal readings of gas
analyzers and calibration systems. That is, a digital data acquisition system not only can record the
analyzer's output readings, but can schedule and direct the performance of analyzer calibrations, and
record calibrator settings and status. Thus, flagging of calibration data to distinguish them from ambient
monitoring data is conducted automatically during data acquisition with no additional effort or post-
analysis. These capabilities greatly reduce the time and effort needed to organize and quantify calibration
results.
14.1.3 DAS Quality Assurance/Quality Control
Most automated data acquisition systems support the acquisition of QC data like zero, one point QC, span
and calibration data. When QC data are acquired automatically by a data acquisition system for direct
computer processing, the system must be sufficiently sophisticated to:
ensure that the QC data are never inadvertently reported as ambient measurements,
[6] Version 4 of the Technical Assistance Document for Precursor Gas Measurements in the NCore Multi-pollutant
Monitoring Network. http://www.epa.gov/ttn/amtic/ncore/guidance.html
[7] http://www.epa.gov/ttn/airs/airsaqs/aqsweb/
[8] http://airnow.gov/
ignore transient data during the stabilization period before the analyzer has reached a stable QC
response (this period may vary considerably from one analyzer to another),
average the stable QC readings over some appropriate time period so that the readings obtained
accurately represent the analyzer's QC response, and
ignore ambient readings for an appropriate period of time immediately following a QC reading
until the analyzer response has stabilized to the ambient-level concentration (see the sketch following this list).
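A rough sketch of this segregation logic, assuming non-overlapping QC events, 1-minute readings, and an illustrative 15-minute stabilization window (names and durations are assumptions, not requirements):

    from datetime import timedelta

    def label_minutes(minute_stamps, qc_events, stabilize=timedelta(minutes=15)):
        """Label each 1-minute reading 'ambient', 'qc', or 'stabilizing'.

        qc_events: list of (start, end) datetimes for automated QC checks,
        assumed non-overlapping. QC and stabilizing periods must never be
        reported as ambient measurements.
        """
        labels = []
        for ts in minute_stamps:
            label = "ambient"
            for start, end in qc_events:
                if start <= ts <= end:
                    label = "qc"              # analyzer is sampling QC gas
                elif end < ts <= end + stabilize:
                    label = "stabilizing"     # ignore until response recovers
            labels.append((ts, label))
        return labels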
In relation to the DAS, quality assurance seeks to ensure that the DAS is being operated within defined
guidelines. Usually, this means that each value that is collected by the DAS is the same value that is
generated from the analyzer and reported to the Air Quality System (AQS) database. This usually is
accomplished by DAS calibrations and data trail audits.
Calibration- In the case where analog signals from monitoring equipment are recorded by the DAS, the
calibration of a DAS is similar to the approach used for calibration of a strip chart recorder. To calibrate
the DAS, known voltages are supplied to each of the input channels and the corresponding measured
response of the DAS is recorded. Specific calibration procedures in the DAS owner's manual should be
followed when performing such DAS calibrations. For DAS that receive digital data from the
instruments, a full scale check (the instrument is placed in a mode in which its output is at the full scale of
the instrument) should be performed to see if the data received digitally are the same as the display of the
instrument. The DAS should be calibrated at least once per year. Appendix G provides a simple approach
for calibration of the DAS.
In addition, gas analyzers typically have an option to set output voltages to full scale or to ramp the
analog output voltages supplied by the analyzer over the full output range. Such a function can be used to
check the analog recording process from the analyzer through the DAS.
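A hedged sketch of such a check for one analog input channel, with illustrative voltages and tolerances (actual acceptance limits should come from the DAS owner's manual or Appendix G):

    import numpy as np

    # Known voltages supplied to the channel and the DAS readings of them.
    applied  = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])              # volts
    measured = np.array([0.001, 1.002, 1.999, 3.003, 4.001, 4.998])  # volts

    slope, offset = np.polyfit(applied, measured, 1)  # ideal: slope 1, offset 0
    if abs(slope - 1.0) > 0.005 or abs(offset) > 0.005:  # example tolerances
        print("Channel out of tolerance -- recalibrate the DAS input")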
Data Trail Audit- The data trail audit consists of following a value or values from the monitoring
instrument to the DAS, from the DAS to the local central computer, and then from the local central
computer to AQS. A person other than the normal station operator should perform this duty. A
procedure similar to the following should be conducted:
A data value(s) should be collected from the monitor (usually an hourly value or another
aggregated value reported to AQS) and be compared to the data stored in the DAS for the same
time period. Also, if strip chart recorders are used, a random number of hourly values should be
compared to the data collected by the DAS. This audit should be completed on a regular defined
frequency and for every pollutant reported.
From the central computer, the auditor checks to see if this hourly value is the same.
The above actions should be completed well in advance of data submittal to AQS. If the data have been
submitted to AQS, then the AQS database should be checked and modified as necessary per the
appropriate AQS procedures.
Whether a monitoring organization is transferring the data from an instrument via an on-site DAS or
transferring the data digitally, the data trail audit should be performed on a routine basis.
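The comparison step lends itself to a simple script. The sketch below assumes hourly values keyed by timestamp at each stage of the trail; the dictionary names and the tolerance are illustrative only:

    def audit_hour(hour, monitor, das, central, aqs, tol=0.05):
        """Compare one hourly value across the data trail.

        monitor, das, central, aqs: dicts mapping hour -> reported value.
        Returns the stages whose value differs from the monitor's.
        """
        values = {"monitor": monitor[hour], "DAS": das[hour],
                  "central computer": central[hour], "AQS": aqs[hour]}
        reference = values["monitor"]
        return {stage: v for stage, v in values.items()
                if abs(v - reference) > tol}   # empty dict = consistent trail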
Initialization Errors
All data acquisition systems must be initialized. The initialization consists of an operator setting up the
parameters so that the voltages produced by the instruments can be read, scaled correctly and reported in
the correct units. Errors in initializations can create problems when the data are collected and reported.
Read the analyzer manufacturer's literature before parameters are collected. If the manufacturer does not
state how these parameters are collected, request this information. The following should be performed
when setting up the initializations:
Check the full scale outputs of each parameter.
Calibrations should be performed after each initialization (each channel of a DAS should be
calibrated independently). Appendix G provides an example of a DAS calibration technique.
Review the instantaneous data stream, if possible, to see if the DAS is collecting the data
correctly.
Save the initializations to a storage medium; if the DAS does not have this capability, print out
the initialization and store it at the central computer location and at the monitoring location.
Check to see if the flagging routines are performed correctly; data that are collected during
calibrations and down time should be flagged correctly.
Check the DAS for excessive noise (variability in signal). Noisy data that are outside of the
normal background are a concern. Noisy data can be caused by improperly connected leads to
the multiplexer, noisy AC power, or a bad multiplexer. Refer to the owner's manual for help on
noisy data.
Check to see that the averaging times are correct. Some DAS consider 45 minutes to be a valid
hour, while others consider 48 minutes. Agency guidelines should be referred to before setting
up averaging times (see the sketch below).
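As a small illustration of such a minimum-minutes rule (45 minutes is used purely as an example; the correct threshold comes from agency guidelines):

    def hourly_average(minute_values, min_valid_minutes=45):
        """Return the hourly mean, or None if too few valid minutes.

        minute_values: one entry per minute of the hour; None marks a
        flagged or missing minute (calibration, downtime, etc.).
        """
        valid = [v for v in minute_values if v is not None]
        if len(valid) < min_valid_minutes:
            return None          # hour fails the validity criterion
        return sum(valid) / len(valid)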
14.1.4 Data Logger to Database
Once data are on the data logger at the ambient air monitoring station, they need to be sent to servers
where they can be summarized and disseminated to data users. In most cases this will occur by using a
server at the office of the monitoring organization. The conventional way to get data from the monitoring
stations has been to poll each of the stations individually. With more widespread availability of the
internet, pushing data from monitoring sites on a regular basis will be especially effective in mapping and
public reporting of data. Note, in some cases it is possible to report data directly from a monitor to a
database without the use of a station data logger. This solution is acceptable so long as the monitor is
capable of data storage for periods when telemetry is off-line.
Data transfer is usually accomplished in three ways: hard copy printout, downloading data from an internal
storage medium to an external storage medium, or digital transfer via telephone lines, the internet, satellite, or
other advanced means of communication. Due to the desire for real-time data for the Air Quality Index
(AQI) and other related needs, monitoring organizations should plan to upgrade to digital data acquisition
and communication systems.
Hard copy report- With the advent of sophisticated DAS networks and data backup systems, hard copy
reports are being generated less frequently. Therefore, if hard copy reports are not being used, it is
strongly recommended that monitoring organizations create an electronic back-up of their data on a
defined and frequent schedule. The frequency of the back-ups and any other associated information
should be reflected in their Quality Assurance Project Plan (QAPP) and Standard Operating Procedures
(SOP). However, for some smaller monitoring networks hard copy reports have some advantages:
they can be reviewed by the station operators prior to and/or during site visits to ascertain the
quality of the data;
they can be compared against the historical data stored on the DAS at the site for validation;
notes can be made on the hard copy reports for later review by data review staff; and
they create a back-up to the electronically based data.
External Storage- This term refers to storing and transferring the data on external media such as
diskettes, flash drives, or CD-ROMs. Many new generation DAS are computer platforms that contain
ports for these storage devices. If remote access via telephone is not an option, then data can be hand
transferred to a central office for downloading and data review. This is usually the method used to transfer
data from manual methods.
Digital Transfer- All new generation DAS allow access to the computer via the telephone and modem.
These systems allow fast and effective ways to download data to a central location. The EPA
recommends using these systems for the following reasons:
in case of malfunction of an ambient instrument, the appropriate staff at the central location can
begin to diagnose problems and decide a course of action;
downloading the data allows the station operators, data processing team, and/or data validators to
get a head start on reviewing the data; and
when pollution levels are high or forecasted to be high, digital transfer allows the pollution
forecaster the ability to remotely check trends and ensure proper operation of instruments prior to
and during an event.
NOTE: In any of these systems it is necessary to plan for some type of system back-up in case of
unexpected crashes in order to reduce and minimize data loss.
14.1.5 Manual Data Acquisition
Most of this section has been devoted to the collection of data through automated DAS. In some
ways, once the DAS is properly set up and checked, the systems are reliable, can be checked
remotely, and are easier to manage than manual data acquisition. Recovery and collection of data
from manual samplers can in some ways be more complicated because it includes the retrieval of
not only samples, which may include the use of hand entered data sheets and chain of custody
forms, but electronic sampler information downloaded to USB flash drives, or portable laptops
for data transfer to central offices. The process is further complicated by weather conditions and
sample shipping to remote laboratories where additional logging of samples and data take place.
Monitoring organizations should identify all critical information necessary for a sampling activity
and have standard operating procedures for collecting all important
information pertaining to the sample. As soon as possible, any hand-entered information should
be recorded electronically. Samplers have some storage capacity, so it is suggested that no data
be cleared off the samplers until it is confirmed that the sampler data has been downloaded and
stored in the central office data base. Once stored electronically, the management of this
information should follow the same procedures as those for automated data retrievals.
14.2 Data Transfer - Public Reporting
The area of public reporting for air monitoring data may provide the largest number of users of data. For
public reporting of the AQI, the AIRNow web site will remain the EPA's primary medium for
distribution of near real-time air monitoring data. The additional continuous monitoring parameters
collected from NCore will also be reported to AIRNow. These parameters are expected to be made
publicly available for sharing throughout technical user communities. However, they are not expected to
be widely distributed through AIRNow as products for public consumption.
This section will discuss the transfer of data from the monitoring organization to two major data
repositories: 1) AIRNow for near real-time reporting of monitoring data, and 2) AQS for long term
storage of validated data.
14.2.1 Real-time Data Reporting to AIRNow
One of the most important emerging uses of ambient monitoring data has been public reporting of the Air
Quality Index (AQI). This effort has expanded on EPA's AIRNow web site from regionally-based near
real-time ozone mapping products color-coded to the AQI, to a national multi-pollutant mapping,
forecasting, and data handling system of real-time data. Since ozone and PM2.5 drive the highest
reporting of the AQI in most areas, these two pollutants are the only two parameters currently publicly
reported from AIRNow. This program allows for short-term non-validated data to be collected by a
centrally located computer that displays the data in near real-time formats such as tables and contour
maps.
While other pollutants such as CO, SO2, NO2, and PM10 may not drive the AQI, they are still important
for forecasters and other data users to understand for model evaluation and tracking of air pollution
episodes. Therefore, for the NCore sites, the goal is to report all gaseous CO, SO2, NO and NOy data
and base meteorological measurements to AIRNow.
Reporting Intervals
Currently, hourly averages are the reporting interval for continuous particulate and gaseous data. These
are the reporting intervals for both AQS (AQS supports a variety of reporting intervals) and AIRNow for
AQI purposes. These reporting intervals will meet most of the multiple objectives of NCore for
supporting health effects studies, AQI reporting, trends, NAAQS attainment decisions, and accountability
of control strategies. However, with these objectives also comes the desire for data at finer time
resolutions: 5 minute averages for gaseous pollutants and sub-hourly averages for certain particulate
matter monitors. Examples of this need for finer time resolution of data include, but are not limited to:
tracking air pollution episodes, providing data for exposure studies, model evaluation, and evaluating
shorter averaging periods for potential changes to the NAAQS. Monitoring organizations generally have
the hardware and software necessary to log and report this data. The challenge to obtaining and reporting
the data is the current communication packages used, such as conventional telephone modem polling. One
widely available solution would be the use of internet connectivity, allowing data at individual
monitoring sites to be pushed to a central server rather than being polled. Monitoring organizations
should begin to investigate the possibilities of using this medium.
With the generation/reporting of data at shorter averaging intervals, the challenge becomes validation of
all the data. The historical perception has been that each criteria pollutant measurement needs to be
verified and validated manually. With the amount of data generated, this would be a time-consuming task.
To provide a nationally consistent approach for the reporting interval of data, the NCore networks will
take a tiered approach to data reporting. At the top tier, hourly data intervals will remain the standard for
data reporting. Long term, the NCore networks will be capable of providing at least 5 minute intervals for
those methods that have acceptable data quality at those averaging periods. For QA/QC purposes such as
zero/span and one-point QC, monitoring organizations should be capable of assessing data on at least a 1-
minute interval.
With instantaneous data going to external websites, monitoring organizations operating their own
websites containing the same local and/or regional data should add a statement about the quality of data
being displayed at the site. This cautionary statement will notify the public that posted data has not been
fully quality assured and discrepancies may occur. For example, the AIRNow Web site makes the
statement:
Although some preliminary data quality assessments are performed, the data as such are
not fully verified and validated through the quality assurance procedures monitoring
organizations use to officially submit and certify data on the EPA AQS (Air Quality
System). Therefore, data are used on the AIRNow Web site only for the purpose of
reporting the AQI. Information on the AIRNow web site is not used to formulate or
support regulation, guidance or any other Agency decision or position.
14.2.2 Reporting Frequency and Lag Time for Reporting Data to AIRNow
Continuous monitoring data that are being provided to AIRNow in near real-time are to be reported each
hour. Data should be reported as soon as practical after the end of each hour. For the near term, the goal
is to report data within twenty minutes past the end of each hour. This will provide enough time for data
processing and additional data validation at the AIRNow Data Management Center (DMC), generation of
reports and maps, distribution of those products to a variety of stakeholders and web sites, and staff
review before the end of the hour. This is an important goal to support reporting of air pollution episodes
on news media programs by the top of the hour. The long term goal for NCore sites is to report all data
within five minutes after the end of an hour.
14.3 Data Transfer-Reporting to External Data Bases
Today, the need for the ambient air monitoring data reaches outside the monitoring community. In
addition to the traditional uses of the data, determination of NAAQS compliance and the daily AQI
report, a health researcher or modeler may want a very detailed accounting of the available data in the
shortest time intervals possible. Atmospheric scientists typically desire data in a relatively unprocessed
yet comprehensive form with adequate descriptions (metadata) to allow for further processing for
comparability to other data sets. These needs increase the demands for the data and require multiple
reports of the information.
14.3.1 AQS Reporting
All ambient air monitoring data will eventually be transferred and stored in AQS. As stated in 40 CFR
Part 58.16 [9], the monitoring organization shall report all ambient air monitoring and associated quality
assurance data and information specified by the AQS User Guide in the AQS format. The data are to
be submitted electronically and on a specified quarterly basis. Since changes in reporting requirements
occur, monitoring organizations should review the CFR for the specifics of this requirement.
[9] http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title40/40tab_02.tpl
The AQS manuals are located at the AQS Website [10]. The AQS Data Coding Manual replaces the
. The AQS Data Coding Manual replaces the
previous Volume II and provides coding instructions, edits performed, and system error messages. The
AQS User Guide replaces the former Volume III and describes the procedures for data entry. Both
manuals will be updated as needed and the new versions will be available at the web site. Table 14-1
provides the units and the number of decimal places that, at a minimum, are required for reporting to AQS
for the criteria pollutants. These decimal places are used for comparison to the NAAQS and are displayed
in AQS summary reports. However, monitoring organizations can report data with up to 5 digits to the right
of the decimal (beyond five, AQS will truncate). Within the five digits to the right of the decimal place,
for NAAQS comparison purposes, AQS will reduce the data as indicated in the last column of Table 14-1.
Table 14-1 AQS Data Reporting Requirements

Pollutant | Units | Decimal Places | Example | Minimum reporting requirement (as described in 40 CFR Part 50)
PM2.5 | µg/m3 | 1 | 10.2 | Shall be reported to AQS in micrograms per cubic meter (µg/m3) to one decimal place, with additional digits to the right being truncated (App. N)
PM10 | µg/m3 | 1 | 26.2 | No description found
Lead (Pb) TSP and Pb-PM10 | µg/m3 | 3 | 1.525 | Pb-TSP and Pb-PM10 measurement data are reported to AQS in units of micrograms per cubic meter (µg/m3) at local conditions (local temperature and pressure, LC) to three decimal places; any additional digits to the right of the third decimal place are truncated (App. R)
O3 | ppm | 3 | 0.108 | Hourly average concentrations shall be reported in parts per million (ppm) to the third decimal place, with additional digits to the right of the third decimal place truncated (App. P)
SO2 | ppb | 1 | 35.1 | Reported to AQS in units of parts per billion (ppb), to at most one place after the decimal, with additional digits to the right being truncated with no further rounding (App. T)
NO2 | ppb | 1 | 53.2 | Reported to AQS in units of parts per billion (ppb), to at most one place after the decimal, with additional digits to the right being truncated with no further rounding (App. S)
CO | ppm | 1 | 2.5 | No description found
PM10-2.5 | µg/m3 | 1 | 10.2 | No description found; follow PM2.5 requirements
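As a small illustration of truncation (as opposed to rounding) to the decimal places in Table 14-1, the sketch below uses a dictionary that simply mirrors the table (PM10-2.5 follows the PM2.5 convention):

    import math

    DECIMALS = {"PM2.5": 1, "PM10": 1, "Pb": 3, "O3": 3,
                "SO2": 1, "NO2": 1, "CO": 1, "PM10-2.5": 1}

    def truncate_for_aqs(pollutant, value):
        """Truncate a non-negative concentration; never round up."""
        factor = 10 ** DECIMALS[pollutant]
        return math.floor(value * factor) / factor

    print(truncate_for_aqs("O3", 0.10879))   # -> 0.108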
14.3.2 Standard Format for Reporting to AQS
AQS allows flexibility in reporting formats. The formats previously used by AQS can be used for raw
data (hourly, daily, or composite) and for reporting precision and bias data. The system also has new
report formats for this data as well as formats for registering new sites and monitors. These new formats
are defined in the AQS Data Coding Manual. Work is also in progress to define an Extensible Markup
Language (XML) schema for AQS reporting. Use of XML as a data format is consistent with EPA and
Federal guidelines towards better data integration and sharing.
14.3.3 Important AQS Agency Roles
Some fields in AQS are key to identifying the agency or organization responsible for certain aspects of
monitoring. Because State agencies may play some overarching roles (such as reporting data or being
responsible for QA aspects as a PQAO) yet not be responsible for the monitoring of some sites (e.g., local
organizations or Tribes), it is important to understand, identify and use these roles correctly. Table 14-2
identifies the agency roles for AQS reporting.
[10] http://www.epa.gov/ttn/airs/airsaqs/manuals/
Table 14-2 AQS Agency Roles

Role Name | Definition | Relationship to a monitor | Comments
Monitoring Organization (MO) | Organization responsible for operation of the monitoring network | Each monitor can only be associated with one MO at any particular date/time | QMPs must be related to the MO
Primary Quality Assurance Organization (PQAO) | Agency responsible for operation of the monitoring network | Each monitor can only be associated with one PQAO at any particular date/time | There can be multiple MOs in a PQAO and it can be pollutant specific. QAPPs must be related to MO and PQAO
Submitting Organization (SO) | Organization submitting the data to AQS | None | Data for particular monitors could be submitted by multiple organizations, for example, field data by the MO and analytical data by an analyzing agency
Analyzing Agency | Organization performing the analysis on samples | None |
Collecting Organization | Organization responsible for collecting data or maintaining the monitor | None | In some cases the MO may contract out monitoring activities
It must be mentioned that, at a minimum, for any raw data submittal the MO, PQAO and SO must be entered.
In many cases they may be the same organization; in other cases they may not.
14.3.4 Expanded QA Information to be Reported to AQS
Since the last revision to this Handbook, the process of reporting QA data to AQS has been improved.
New QA transactions have been developed that support the reporting of additional quality control data
that do not need to be fit into either an accuracy or precision transaction. Many of the transactions will
be optional for use (i.e., duplicates, replicates, audit of data quality) depending on the monitoring
program. However, QA transactions will be required for entry of the traditional Appendix A QC data as
well as pertinent information for quality management plans (QMPs), quality assurance project plans
(QAPPs) and Regional Office technical systems audits (TSAs) since they are a requirement for receiving
grant funds (QMP/QAPPs) and are included in the 40 CFR Part 58 App A.
PQAO and MO Relationships Relative to QMPs and QAPPs
QMPs--
The following fields will be required for QMP reporting:
1. Submitting Agency Code - MO code
2. Submission Date - Date QMP submitted to EPA; helps with tracking the approval process
3. Approving Agency Code - Code for the EPA Region
4. Approval Date - EPA approval date
5. QMP Status Code - code identifying the stage of review/approval of the QMP
6. Comments - free-form comments
A MO meeting the definition above and receiving STAG funds must have a QMP approved by EPA. In
most cases, the QMP is an overarching document that covers all the pollutants measured by the MO and
is separate from the QAPP. In this case, the submitting agency code should be the MO associated with
the QMP.
For smaller organizations (e.g., tribes and small local MOs), EPA has allowed for consolidation of QMP
and QAPPs. In this case, even though it is one document, the MO should report a submission and
approval date for the QMP and the same date for the QAPP as a separate submission (see QAPP
information below).
For some pollutants there may be a number of local monitoring organizations (MOs) that have
consolidated to form a single PQAO. In this instance, a single QMP or a consolidated QMP/QAPP may
be developed. However, even in this case, each distinguishable MO should report a submission and
approval date for the QMP and the same date for the QAPP.
QAPPs--
The following fields will be required for QAPP reporting:
1. Submitting Agency Code - MO code
2. PQAO Code - PQAO code (may be the same as the submitting agency, but not necessarily)
3. Parameter Classification - Identifies the individual pollutants or the network (CSN, NATTS) for
which the QAPP is developed
4. Submission Date - Date QAPP submitted to EPA; helps with tracking the approval process
5. Approving Agency Code - May be EPA or the submitting agency
6. Approval Date - Date QAPP approved by EPA or the submitting agency
7. QAPP Status Code - code identifying the stage of review/approval of the QAPP
8. Comments - free-form comments
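For internal tracking, the QMP and QAPP fields above map naturally onto a small record type. A sketch (the field names and status vocabulary are assumptions for illustration, not AQS codes):

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class QappRecord:
        """QAPP reporting fields listed above; a QMP record would be similar
        but without pqao_code and parameter_classification."""
        submitting_agency_code: str           # MO code
        pqao_code: str                        # may equal the submitting agency
        parameter_classification: str         # pollutant code or "CSN", "NATTS"
        submission_date: date
        approving_agency_code: str            # EPA or submitting agency
        approval_date: Optional[date] = None  # unset while under review
        qapp_status_code: str = "IN REVIEW"   # assumed status vocabulary
        comments: str = ""

    rec = QappRecord("0550", "0550", "44201", date(2013, 3, 1), "EPA R4")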
Since a MO can consolidate to larger PQAOs for a pollutant11, there is a possibility that a QAPP can be
submitted by a MO even though it references its association to a larger PQAO, or a QAPP can be
developed by the PQAO that is utilized by all or some of the MOs within the PQAO. In order to
determine this for each MO, the PQAO must also be reported. Therefore, each MO as defined in Table
14-2 must report QAPP data for any parameter or parameter classification. Since a MO may be
consolidated into a PQAO for one pollutant and not another, for the criteria pollutants the QAPP reporting
process will be on the pollutant level. For monitoring networks like NATTS or CSN, the information can
be submitted at the network identifier level.
14.3.5 Annual Certification of Data
The annual data certification is also stored in AQS. The monitoring organization is required to certify the
data (by formal letter) for a calendar year (Jan 1-Dec 31) by May 1 of the following year. See 40 CFR
Part 58.15 for details, since this time period can change. This certification requires the monitoring
organization to review the air quality data and quality assurance data for completeness and validity and to
submit a certification letter and accompanying data certification reports to the Regional Office. The
certification letter and accompanying reports are reviewed and, if the results of the review are consistent
with the criteria for certification, the certification flag is set in the AQS database. After certification is
complete, any updates to the data will cause the critical review process to identify that the certified data
have been changed, and the certification flag will be dropped. In 2013, EPA developed an automated
certification process that allows the EPA Regions to certify monitoring organization data. At the time of
publication of this document the process was still under development.

11 With the introduction of PQAOs in CFR in 2006, some local monitoring organizations consolidated to a larger PQAO for PM2.5 monitoring
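The certification window is simple to compute; a sketch based on the May 1 deadline in effect for this Handbook (which 40 CFR 58.15 may change):

    from datetime import date

    def certification_due(data_year: int) -> date:
        """Due date for certifying a calendar year of data: May 1 of the
        following year, per the deadline described above."""
        return date(data_year + 1, 5, 1)

    print(certification_due(2012))  # 2013-05-01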
14.3.6 Summary of Desired Performance for Information Transfer Systems
To define the needed performance criteria of a state-of-the-art information technology system, a table of
needs has been developed. This table provides performance needs for an optimal information technology
system, but is not intended to address what the individual components should look like. For instance,
once low-level validated data for a specific time period are ready to leave the monitoring station, a
number of telemetry systems may actually accomplish moving those data. By identifying the needed
performance criteria for moving data, rather than the actual system to move them, monitoring organizations
are free to identify the optimal system for their network. Table 14-3 summarizes the
performance elements of the data management systems used to log, transfer, validate, and report data
from NCore ambient air monitoring stations.
Table 14-3 NCore (Level 2 and 3) Information Technology Performance Needs12
Performance Element | Performance Criteria | Notes
Sample Periods | 5-minute data (long-term goal) and 1-hour data (current standard) | 5-minute and 1-hour data support exposure, mapping and modeling; 1-hour data support Air Quality Index reporting and the NAAQS. The sample period may need to be longer for certain pollutant measurement systems, depending on the method sample period and measurement precision when averaging small time periods.
Data Delivery | Near-term goal: within 20 minutes nationally each hour. Long-term goal: within 5 minutes nationally each hour. | As monitoring organizations migrate to new telemetry systems, the goal will be to report data within 5 minutes. This should be easily obtained with broadband pushing of data to a central server.
Low Level Validation | Last automated zero and QC check acceptable; range check acceptable; shelter parameters acceptable; instrument parameters acceptable | Other validation should be applied as available: site-to-site checks, rate of change, lack of change.
Data Availability | All QC data, operator notes, calibrations, and pollutant data within the network; low-level validated pollutant data externally | Create a log of all monitoring-related activities internally. Allow only validated data to leave the monitoring organization network.
Types of monitoring data to disseminate externally | Continuous and semi-continuous pollutant data; accompanying meteorological data | Associated manual method supporting data (for instance, FRM ambient temperature) should be collected but not reported externally.
Additional data for internal tracking | Status of ancillary equipment such as shelter temperature, power surges, zero air system, calibration system |
Relevant site information | Latitude, longitude, altitude, land use category, scale of representativeness, pictures and map of area | Other site information may be necessary.
Remote calibration | Ability to initiate automated calibrations on a regular schedule or as needed |
Reviewing calibration | Allow for 1-minute data as part of the electronic calibration log |
Initialization of manual collection method | Need to be able to remotely initiate these or have them set at an action level from a specific monitor |
Reporting Format | Short term: maintain the Obs file format and pipe-delimited formats for AIRNow and AQS reporting, respectively. Near term: XML. | Need to coordinate development of the XML schema with multiple stakeholders. XML is an open format that can be read by most applications.

12 See NCore Technical Assistance Document Version 4, http://www.epa.gov/ttn/amtic/ncore/guidance.html
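The low-level validation row above translates directly into an automated gate at the station. A sketch with illustrative field names and limits (actual ranges are method- and shelter-specific):

    def passes_low_level_validation(record: dict) -> bool:
        """Apply the Table 14-3 low-level validation criteria to one hourly
        record; field names and limits here are illustrative assumptions."""
        return all((
            record.get("last_zero_qc_ok", False),                 # automated zero/QC check
            0.0 <= record.get("value_ppm", -1.0) <= 0.5,          # range check (example limits)
            20.0 <= record.get("shelter_temp_c", -99.0) <= 30.0,  # shelter parameters
            record.get("instrument_status") == "OK",              # instrument parameters
        ))

    hourly = {"last_zero_qc_ok": True, "value_ppm": 0.041,
              "shelter_temp_c": 24.5, "instrument_status": "OK"}
    print(passes_low_level_validation(hourly))  # True -> data may leave the station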
14.4 Data Management
Managing the data collected is just as important as correctly collecting the data. The amount of data
collected will continue to grow based on the needs of the data users. Previous sections have confirmed
this statement, providing a glimpse of the potential data users and the uses. Generally, data are to be
retained for a period of 3 years from the date the grantee submits its final expenditure report, unless
otherwise noted in the funding agreement13. With electronic records and electronic media, this
information can be stored and managed with less use of space than with conventional paper records.
However, even with today's technology there will be some paper records, and those need to be managed in
an orderly manner. The manner in which a monitoring organization manages its data is documented in its
QMP and QAPP.
Management is marshaling scarce resources to accomplish goals. Challenges imposed by the need to
capture increasing volumes of data and to make those data available to the public and other groups in
various formats and in short timeframes require a strategy for obtaining enough of these resources:
- computer processing capacity,
- data storage, archival storage, paper file storage,
- floor space, and
- support staff
for deployment among central offices, local offices, and monitoring sites to capture data having the
quality characteristics listed in 14.0.
Air monitoring organization managers may want to seek the assistance of their organization's IT staff
and/or hardware/software maintenance contractors. Managers may find it helpful to consult these
references:
- EPA's records management webpage14
- Section 5 of this document
- Good Automated Laboratory Practices, posted on the OEI website
This information should be reviewed not only by those in a monitoring organization responsible for
overall data management but also by the monitoring organization's Systems or Network Administrator.
The latter person(s) can provide helpful information in designing the overall data management system
according to today's industry standards. Remember, the data have to be of known quality, reliable, and
defensible. In order for monitoring organizations to continue to meet those objectives, many sources of
information need to be reviewed.
13 40 CFR Part 31.42
14 http://www.epa.gov/records/
15.0 Assessment and Corrective Action
An assessment is an evaluation process used to measure the performance or effectiveness of a system and
its elements. It is an all-inclusive term used to denote any of the following: audit, performance
evaluation, management systems review, peer review, inspection and surveillance. For the Ambient Air
Quality Monitoring Program, the following assessments will be discussed: network reviews, performance
evaluations, technical systems audits and data quality assessments.
15.1 Network Reviews
As described in 40 CFR 58.10,1 beginning July 1, 2007, the State, or where applicable, local
monitoring organizations shall adopt and submit to the Regional Administrator an annual monitoring
network plan (40 CFR 58.10) which shall provide for the establishment and maintenance of an air
quality surveillance system that consists of a network of SLAMS monitoring stations including FRM,
FEM, and ARM monitors that are part of SLAMS, NCore stations, CSN stations, State speciation stations,
SPM stations, and/or, in serious, severe and extreme ozone nonattainment areas, PAMS stations, and
SPM stations. The plan shall include a statement of purposes for each monitor and evidence that siting
and operation of each monitor meets the requirements of appendices A, C, D, and E of Part 58, where
applicable. The annual monitoring network plan must be made available for public inspection for at least
30 days prior to submission to EPA. The AMTIC Website has a page2 devoted to the progress and
adherence to this requirement. This page contains links to State and local ambient air monitoring plans.
In addition to an annual network plan, starting in 2010, the State, or where applicable, local monitoring
organization shall perform and submit to the EPA Regional Administrator an assessment of the air quality
surveillance system every 5 years to determine, at a minimum, if the network meets the monitoring
objectives defined in 40 CFR Part 58, Appendix D, whether new sites are needed, whether existing sites
are no longer needed and can be terminated, and whether new technologies are appropriate for
incorporation into the ambient air monitoring network. The network assessment must consider the ability
of existing and proposed sites to support air quality characterization for areas with relatively high
populations of susceptible individuals (e.g., children with asthma), and for any sites that are being
proposed for discontinuance, the effect on data users other than the monitoring organization itself, such as
nearby States and Tribes or health effects studies. For PM2.5, the assessment also must identify needed
changes to population-oriented sites. The state, or where applicable, local monitoring organization must
submit a copy of this 5-year assessment, along with a revised annual network plan, to the Regional
Administrator.
In order to maintain consistency in implementing and collecting information from a network review, EPA
has developed the document Ambient Air Monitoring Network Assessment Guidance3. The information
presented in this section provides some excerpts from this guidance document.
presented in this section provides some excerpts from this guidance document.
15.1.1 Network Selection
Due to the resource-intensive nature of network reviews, it may be necessary to prioritize monitoring
organizations and/or pollutants to be reviewed. The following criteria may be used to select networks:
1 http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title40/40tab_02.tpl
2 http://www.epa.gov/ttn/amtic/plans.html
3 http://www.epa.gov/ttn/amtic/cpreldoc.html
- date of last review;
- areas where attainment/nonattainment designations are taking place or are likely to take place;
- results of special studies, saturation sampling, point source oriented ambient monitoring, etc.; and
- monitoring organizations which have proposed network modifications since the last network review.
In addition, pollutant-specific priorities may be considered (e.g., newly designated ozone nonattainment
areas, PM10 "problem areas", etc.). Once the monitoring organizations have been selected for review,
significant data and information pertaining to the review should be compiled and evaluated. Such
information might include the following:
- network files for the selected monitoring organization (including updated site information and site photographs);
- AQS reports (AMP220, 225, 255, 380, 390, 450);
- air quality summaries for the past five years for the monitors in the network;
- emissions trends reports for major metropolitan areas;
- emission information, such as emission density maps for the region in which the monitor is located and emission maps showing the major sources of emissions; and
- National Weather Service summaries for the monitoring network area.
Upon receipt, the information should be checked for consistency and to ensure it is the latest
revision. Discrepancies should be noted on the checklist (Appendix H) and resolved with the monitoring
organization during the review. Files and/or photographs that need to be updated should also be
identified.
15.1.2 Conformance to 40 CFR Part 58 Appendix D- Network Design Requirements
With regard to 40 CFR Part 58 Appendix D requirements, the network reviewer must determine the
adequacy of the network in terms of number and location of monitors: specifically, (1) is the monitoring
organization meeting the number of monitors required by the design criteria requirements; and (2) are the
monitors properly located, based on the monitoring objectives and spatial scales of representativeness?
Number of Monitors
For SLAMS, NCore and PAMS, the minimum number of monitors required is specified in the regulations.
As revisions occur to the NAAQS, the number of required monitors may also change, so the reader
should keep abreast of the changes that can occur in Appendix D. Adequacy of the network
may be determined by using a variety of tools, including the following:
- maps of historical monitoring data;
- maps of emission densities;
- dispersion modeling;
- special studies/saturation sampling;
- SIP requirements;
- revised monitoring strategies (e.g., lead strategy, reengineering air monitoring network); and
- best professional judgment.
Location of Monitors
Appendix D does provide a general description of the location of sites needed for NAAQS-related
monitoring. The EPA Regional Office and monitoring organizations work together to identify the best
location for the monitors based upon the siting/location requirements defined in Appendix D. Adequacy
of the location of monitors can only be determined on the basis of stated objectives. Maps, graphical
overlays, and GIS-based information can be extremely helpful in visualizing or assessing the adequacy of
monitor locations. Plots of potential emissions and/or historical monitoring data versus monitor locations
are especially useful.
For PAMS, there is considerable flexibility when locating each site within a nonattainment area or
transport region. The three fundamental criteria which need to be considered when locating a final PAMS
site are: (1) sector analysis - the site needs to be located in the appropriate downwind (or upwind) sector
(approximately 45°) using appropriate wind directions; (2) distance - the sites should be located at
distances appropriate to obtain a representative sample of the area's precursor emissions and represent the
appropriate monitoring scale; and (3) proximate sources.
15.1.3 Conformance to 40 CFR Part 58, Appendix E - Probe Siting Requirements
Applicable siting criteria for SLAMS, NCore, and PAMS are specified in 40 CFR Part 58, Appendix E.
The on-site visit itself consists of the physical measurements and observations needed to determine
compliance with the Appendix E requirements, such as height above ground level, distance from trees,
paved or vegetative ground cover, etc. Prior to the site visit, the reviewer should obtain and review the
following:
- most recent hard copy of the site description (including any photographs)
- data on the seasons with the greatest potential for high concentrations for specified pollutants
- predominant wind direction by season
The checklist provided in Appendix H of this Handbook is also intended to assist the reviewer in
determining conformance with Appendix E. In addition to the items on the checklist, the reviewer should
also do the following:
- ensure that the manifold and inlet probes are clean
- estimate probe and manifold inside diameters and lengths
- inspect the shelter for weather leaks, safety, and security
- check equipment for missing parts, frayed cords, etc.
- check that monitor exhausts are not likely to be introduced back to the inlet
- record findings in a field notebook and/or checklist
- take photographs/video in the 8 cardinal directions
- document site conditions, with additional photographs/video
15.1.4 Checklists and Other Discussion Topics
Checklists are provided in Appendix H to assist network reviewers (SLAMS and PAMS) in conducting
the review. In addition to the items included in the checklists, other subjects for possible discussion as
part of the network review and overall adequacy of the monitoring program include:
- installation of new monitors;
- relocation of existing monitors;
- siting criteria problems and suggested solutions;
- problems with data submittals and data completeness;
- maintenance and replacement of existing monitors and related equipment;
- quality assurance problems;
- air quality studies and special monitoring programs; and
- other issues (proposed regulations/funding).
15.1.5 Summary of Findings
Upon completion of the network review, a written network evaluation should be prepared. The
evaluation should include any deficiencies identified in the review, corrective actions needed to address
the deficiencies, and a schedule for implementing the corrective actions. The kinds of
discrepancies/deficiencies to be identified in the evaluation include discrepancies between the monitoring
organization network description and the AQS network description; and deficiencies in the number,
location, and/or type of monitors.
15.2 Performance Evaluations
Performance evaluations (PEs) are a type of audit in which the quantitative data generated in a
measurement system are obtained independently and compared with routinely obtained data to evaluate
the proficiency of an analyst or a laboratory.4 EPA also uses them to evaluate instrument performance.
The National Performance Evaluation Programs:
- Allow one to determine data comparability and usability across sites, monitoring networks (Tribes, States, and geographic regions), instruments and laboratories.
- Provide a level of confidence that monitoring systems are operating within an acceptable level of data quality so data users can make decisions with acceptable levels of certainty.
- Help verify the precision and bias estimates performed by monitoring organizations.
- Identify where improvements (technology/training) are needed.
[Photos: PEP audit; NPAP through-the-probe audit]
- Assure the public of non-biased assessments of data quality.
- Provide a quantitative mechanism to defend the quality of data.
- Provide information to monitoring organizations on how they compare with the rest of the nation in relation to the acceptance limits, and to assist in corrective actions and/or data improvements.

4 American National Standard - Quality Systems for Environmental Data and Technology Programs - Requirements with Guidance for Use (ANSI/ASQC E4-2004)
Some type of national PE program is implemented for all of the ambient air monitoring activities. Table
15-1 provides more information on these activities. It is important that these performance evaluations be
independent in order to ensure they are non-biased and objective. With the passage of the Data Quality
Act5, there is potential for EPA to receive challenges to the quality of the ambient air data. Independent
audits help provide another piece of objective evidence on the quality of a monitoring organization's data
and can help EPA defend the quality of the data.
Table 15-1 National Performance Evaluation Activities6 Performed by EPA
Program/Lead Agency | Explanation
NPAP (OAQPS) | The National Performance Audit Program provides audit standards for the gaseous pollutants, either as devices that the site operator connects to the back of the instrument or, in the through-the-probe case, by presenting audit gases through the probe inlet of ambient air monitoring stations. Flow audit devices and lead strips are also provided through NPAP. NPAP audits are required at 20% of a primary quality assurance organization's sites each year, with a goal of auditing all sites in 5-7 years.
PM2.5/PM10-2.5 PEP (OAQPS) | Performance Evaluation Program. The strategy is to collocate a portable FRM PM2.5 or PM10-2.5 air sampling audit instrument with an established primary sampler at a routine air monitoring site, operate both samplers in the same manner, and then compare the results. Each year, five PEP audits are required for primary quality assurance organizations (PQAOs) with less than or equal to 5 monitoring sites, or eight audits for PQAOs with greater than five sites. These audits are not required for PM10.
Pb-PEP | Performance Evaluation Program. The strategy is to collocate a portable FRM Pb air sampling audit instrument with an established primary sampler at a routine air monitoring site, operate both samplers in the same manner, and then compare the results. Each year, five PEP audits (1 PEP collocated sample and 4 samples from the monitoring organization's routine collocated instrument) are required for PQAOs with less than or equal to 5 monitoring sites, or eight audits (2 PEP collocated samples and 6 samples from the monitoring organization's routine collocated instrument) for PQAOs with greater than five sites.
NATTS PT (OAQPS) | A National Air Toxics Trend Sites (NATTS) proficiency test (PT) is a type of assessment in which a sample, the composition of which is unknown to the analyst, is provided to test whether the analyst/laboratory can produce analytical results within the specified acceptance criteria. PTs for volatile organic compounds (VOCs), carbonyls and metals are performed quarterly for the ~22 NATTS laboratories.
SRP (EPA-RTP) | The Standard Reference Photometer (SRP) Program provides a mechanism to establish traceability of the ozone standards used by monitoring organizations to the National Institute of Standards and Technology (NIST). Every year NIST certifies an EPA SRP. Upon certification, this SRP is shipped to the EPA Regions, who use it to certify the SRP that remains stationary in the Regional Lab. These stationary SRPs are then used to certify the ozone transfer standards used by the State, local and Tribal monitoring organizations, who bring their transfer standards to the Regional SRP for certification.
PAMS Cylinder Certs | EPA developed a system to certify the standards used by the monitoring organizations to calibrate their PAMS analytical systems. The standards are sent to the EPA Office of Radiation and Indoor Air (ORIA-LV), which performs an independent analysis/certification of the cylinders. This analysis is compared to the vendor concentrations to determine whether they are within the contractually required acceptance tolerance.
CSN/IMPROVE Round Robin PTs and Audits (ORIA-AL) | PM2.5 Chemical Speciation Network (CSN) and IMPROVE Round Robins are a type of performance evaluation where the audit samples are developed in ambient air; therefore, the true concentration is unknown. The Office of Radiation and Indoor Air (ORIA) in Montgomery, AL, implements these audits for the CSN/IMPROVE programs and for the PEP weighing laboratories. The audit is performed by collecting samples over multiple days and from multiple samplers. These representative samples are then characterized by the ORIA lab and sent to the routine sample laboratories for analysis. Since the true concentrations are unknown, the reported concentrations are reviewed to determine general agreement among the laboratories. In addition, ORIA implements technical systems audits of IMPROVE and CSN laboratories.
Protocol Gas (OAQPS) | EPA Protocol Gases are used in quality control activities (i.e., calibrations, audits, etc.) to ensure the quality of data derived from ambient air monitors used by every State in the country. EPA developed the Protocol Gas Program to allow standards sold by specialty gas producers to be considered traceable to NIST standards. This program was discontinued in 1998. In 2010, EPA established an Ambient Air Protocol Gas Verification Program7 that utilizes volunteers from the ambient air monitoring community.
5 See www.eenews.net/Greenwire/Backissues/081604/08160403.htm
6 Many of the National PEs can be found at the following website: http://www.epa.gov/ttn/amtic/npepqa.html
7 http://www.epa.gov/ttn/amtic/aapgvp.html
Although Table 15-1 lists seven performance evaluation programs operating at the federal level, the
NPAP and PEP programs will be discussed in more detail. Additional information on both programs can
be found on the AMTIC Website8. The October 17, 2006 monitoring rule identified the monitoring
organizations as responsible for ensuring the implementation of these audits9. Monitoring organizations
can either self-implement the program or continue to participate in the federally implemented program.
This choice is provided to the monitoring organization on an annual basis through a memo from OAQPS
through the EPA Regions. In order for monitoring organizations to self-implement the program, they must
meet criteria related to the adequacy of the audit (number of audits and how it is accomplished) as well as
meet independence requirements (see Figure 15.1).
15.2.1 National Performance Audit Program10
Monitoring organizations operating SLAMS/PAMS/PSD monitors are required to participate in the National
Performance Evaluation Programs by providing adequate and independent audits for their monitors, as per
Section 2.4 of 40 CFR Part 58, Appendix A. One way of providing the audits is to participate in the
NPAP program, either through self-implementation or federal implementation.
The NPAP is a cooperative effort among OAQPS, the 10 EPA Regional Offices, and the monitoring
organizations that operate the SLAMS/PAMS/PSD air pollution monitors. The NPAP's goal is to provide
audit materials and devices that will enable EPA to assess the proficiency of monitoring organizations
that are operating monitors in the SLAMS/PAMS/PSD networks. To accomplish this, the NPAP has
established acceptable limits or performance criteria, based on the data quality needs of the networks, for
each of the audit materials and devices used in the NPAP.

All audit devices and materials used in the NPAP are certified as to their true value, and that certification
is traceable to a National Institute of Standards and Technology (NIST) standard material or device
wherever possible. The audit materials used in the NPAP are as representative of and comparable to the
calibration materials and actual air samples used and/or collected in the SLAMS/PAMS/PSD networks as
possible. The audit material/gas cylinder ranges used in the NPAP are specified in the Federal Register.

Initially, the NPAP was a mailable system in which standards and gases were mailed to monitoring
organizations for implementation. In 2003, OAQPS began instituting a through-the-probe audit system
in which mobile laboratories are sent to monitoring sites and audit gases are delivered through the inlet
probe of the analyzers. The goals of the NPAP audit are:
- Performing audits at 20 percent of monitoring sites per year, and 100% in 5-7 years.
- Data submission to AQS within 3 months of the audit.
- Development of a delivery system that will allow the audit concentration gases to be introduced at the probe inlet where logistically feasible.
- Use of audit gases that are NIST-certified and validated at least once a year for CO, SO2, and NO2.
- Validation/certification with the EPA NPAP program (if self-implementing) through collocated auditing at an acceptable number of sites each year. The comparison tests would have to be no greater than 5 percent different from the EPA NPAP results.
- Incorporation of NPAP in the monitoring organization's quality assurance project plan (if self-implementing).
8 http://www.epa.gov/ttn/amtic/npepqa.html
9 http://www.epa.gov/ttn/amtic/40cfr53.html - Final Revisions to Ambient Air Monitoring Regulations.
10 http://www.epa.gov/ttn/amtic/npapgen.html
The validation template in Appendix D lists the acceptance limits of the NPAP audits.
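For the self-implementation comparison above, the 5 percent criterion can be checked as a simple percent difference; the exact statistic the program uses may differ, so this is only a sketch:

    def percent_difference(self_audit: float, epa_npap: float) -> float:
        """Percent difference of a self-implemented audit result from the
        collocated EPA NPAP result (relative to the EPA value)."""
        return 100.0 * (self_audit - epa_npap) / epa_npap

    d = percent_difference(0.102, 0.100)  # e.g., ozone in ppm
    print(f"{d:+.1f}% -> {'acceptable' if abs(d) <= 5.0 else 'exceeds 5% criterion'}")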
NPAP Corrective Action
Since NPAP can visit only 20% of any monitoring organization's sites in a given year, the data are most
useful in providing EPA with a national assessment of data comparability across the criteria pollutant
network. However, when individual sites fail an audit, EPA will attempt to work with the monitoring
agency to discover the reasons for the failure. Usually the failure is related to a site-specific issue (e.g., a
leak) and not a network issue (e.g., bad calibration gas used to calibrate all monitors). If time is available,
the auditor can attempt to re-audit while at the site. If not, the EPA Region and monitoring organization
can communicate on auditing the site at a later date. Unless the failure is related to an issue with NPAP
equipment, the original results will be reported along with any additional audit results after corrective
action.
15.2.2 PM2.5, PM10-2.5, and Pb Performance Evaluation Programs (PEP)
The Performance Evaluation Program11 is a quality assurance activity which will be used to evaluate
measurement system bias of the PM2.5, PM10-2.5 and Pb monitoring networks. The pertinent regulations
for this performance audit are found in 40 CFR Part 58, Appendix A. The strategy is to collocate a
portable PEP instrument with an established routine air monitoring site, operate both monitors in exactly
the same manner, and then compare the results of this instrument against the routine sampler at the site.
For primary quality assurance organizations with less than or equal to five monitoring sites, five valid
performance evaluation audits must be collected and reported each year. For primary quality assurance
organizations with greater than five monitoring sites, eight valid performance evaluation audits must be
collected and reported each year.

11 http://www.epa.gov/ttn/amtic/pmpep.html
[Figure: organization chart illustrating the two-levels-of-supervision separation between routine field sampling/lab analysis personnel and QA field sampling/lab analysis personnel within an organization.]
Independent assessment - an assessment performed by a qualified individual, group, or
organization that is not part of the organization directly performing and accountable for
the work being assessed. This auditing organization must not be involved with the
generation of the routine ambient air monitoring data. An organization can conduct the
PEP if it can meet the above definition and has a management structure that, at a
minimum, will allow for the separation of its routine sampling personnel from its
auditing personnel by two levels of management, as illustrated in the accompanying
organization chart. In addition, the pre- and post-weighing of audit filters must be
performed by a separate laboratory facility using separate laboratory equipment. Field
and laboratory personnel would be required to meet the FRM Performance Audit field
and laboratory training and certification requirements. The State and local organizations
are also asked to consider participating in the centralized field and laboratory standards
certification process.
Figure 15.1 Definition of independent assessment
A valid performance evaluation audit means that both the primary monitor and PEP audit
concentrations are valid and above 3 µg/m3.
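Screening collocated pairs against the 3 µg/m3 floor before computing comparison statistics looks like the sketch below (the percent-difference form is illustrative; 40 CFR Part 58, Appendix A defines the actual bias statistics):

    def pep_percent_differences(pairs, min_conc=3.0):
        """Keep (routine, PEP) pairs where both concentrations are valid
        (non-None) and above 3 ug/m3, then compute percent differences."""
        return [
            100.0 * (routine - pep) / pep
            for routine, pep in pairs
            if routine is not None and pep is not None
            and routine > min_conc and pep > min_conc
        ]

    pairs = [(12.1, 11.5), (2.4, 2.6), (9.0, None)]  # ug/m3
    print(pep_percent_differences(pairs))  # only the first pair is valid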
The Pb-PEP operates somewhat differently than the PM2.5 and PM10-2.5 PEPs in that it includes a
combination of independent audits and data obtained from the monitoring organization's collocated Pb
sampler that is then sent to the National PEP Laboratory. More details on this process, including all
documentation (the PEP Implementation Plan, QAPP, Field and Laboratory SOPs, and reports for each
PEP), can be found on the AMTIC Bulletin Board at the PEP Website12.

12 http://www.epa.gov/ttn/amtic/pmpep.html
Since performance evaluations are independent assessments, Figure 15.1 was developed to define
independence for the FRM performance evaluation to allow monitoring organizations to self-implement
this activity.
Since the regulations define the performance evaluations as an NPAP-like activity, EPA has made
arrangements to implement this audit. Monitoring organizations can determine, on a yearly basis, to
utilize federal implementation by directing their appropriate percentage of grant resources back to
OAQPS, or to implement the audit themselves. The following activities will be established for federal PEP
implementation:
- field personnel assigned to each EPA Region, with hours based upon the number of required audits in the Region; and
- one national laboratory in Region 4, which will serve as a national weighing lab and will include data submittal to AQS.
PEP Corrective Action
Unlike the NPAP, which can provide immediate feedback on results, the PEP results are not available
until the monitoring organizations have reported their results (data from the routine monitor) to AQS.
This process can take at least 3 months, and sometimes longer. Therefore, feedback cannot be immediate,
and so the PEP has limited use (compared to NPAP) for corrective action at the monitoring organization
level. However, over the years that the PEP has been implemented, EPA has been able to identify bias at
the PQAO level as well as at national levels among method designations. The PEP then helps to inform
those monitoring organizations that may be outside the DQOs or the norm, or that have method
designations that may need some corrective action.
15.2.3 State and Local Organization Performance Audits
Any of the performance evaluation activities mentioned in this section can be performed internally by the
monitoring organizations. If a monitoring organization intends to self-implement NPAP or PEP, then
it will be required to meet the adequacy and independence criteria mentioned in earlier sections. Since
a monitoring organization may want more audits than can be supplied by the NPAP and PEP, it may
decide to augment the federally implemented programs with additional performance audits. These
audits can be tailored to the needs of the monitoring organization and do not necessarily need to follow
NPAP and PEP adequacy and independence requirements. Some information on the procedures for these
audits can be found in Appendix H.
15.3 Technical Systems Audits
A technical systems audit is an on-site review and inspection of a monitoring organization's ambient air
monitoring program to assess its compliance with established regulations governing the collection,
analysis, validation, and reporting of ambient air quality data. A systems audit of each monitoring
organization within an EPA Region is performed every three years by a member of the Regional Quality
Assurance (QA) staff. Detailed discussions of the audits performed by the EPA and the monitoring
organizations are found in Appendix H; the information presented in this section provides general
guidance for conducting technical systems audits. A systems audit should consist of three separate
phases:
- pre-audit activities,
- on-site audit activities, and
- post-audit activities.
Summary activity flow diagrams have been
included as Figures 15.2, 15.3 and 15.5
respectively. The reader may find it useful to
refer to these diagrams while reading this
guidance.
15.3.1 Pre-Audit Activities
At the beginning of each fiscal year, the
audit lead or a designated member of the
audit team should establish a tentative
schedule for on-site systems audits of the
monitoring organizations within their
Region. It is suggested that the audit lead
develop an audit plan. This plan should
address the elements listed in Table 15-2.
The audit plan is not a major undertaking
and in most cases will be a one page table or
report. However, the document represents
thoughtful and conscious planning for an
efficient and successful audit. The audit plan
should be made available to the organization
audited, with adequate lead time to ensure
that appropriate personnel and documents are
available for the audit. Three months prior to
the audit, the audit lead should contact the
quality assurance officer (QAO) of the
organization to be audited to coordinate
specific dates and schedules for the on-site
audit visit. During this initial contact, the
audit lead should arrange a tentative schedule for meetings with key personnel as well as for
inspection of selected ambient air quality monitoring and measurement operations. At the same time,
a schedule should be set for the exit interview used to debrief the monitoring organization director or
his/her designee, on the systems audit outcome. As part of this scheduling, the audit lead should indicate
any special requirements such as access to specific areas or activities. The audit lead should inform the
monitoring organization QAO that the QAO will receive a questionnaire, which is to be reviewed and
completed.
Table 15-2 Suggested Elements of an Audit Plan
Audit Title - Official title of audit that will be used on checksheets and reports
Date/Audit # - Year and number of audit can be combined; e.g., 08-1, 08-2
Scope - Establishes the boundary of the audit and identifies the groups and activities to be evaluated.
The scope can vary from a general overview of the total system to part of the system, and will
determine the length of the audit.
Purpose - What the audit should achieve
Standards - Standards are criteria against which performance is evaluated. These standards must be clear
and concise and should be used consistently when auditing similar facilities or procedures. The
use of audit checklists is suggested to assure that the full scope of an audit is covered. An
example checklist for the Regional TSA is found in Appendix H.
Audit team - Team lead and members.
Auditees - People who should be available for the audit from the audited organization. This should include
the program manager(s), principal investigator(s), monitoring leads, the organization's QA
representative(s), and other management and technicians as necessary.
Documents - Documents that should be available in order for the audit to proceed efficiently. Too often
documents are asked for during an audit, when auditors do not have the time to wait for these
documents to be found. Documents could include QMPs, QAPPs, SOPs, GLPs, control charts,
raw data, QA/QC data, previous audit reports etc.
Timeline - A timeline of when organizations (auditors/auditees) will be notified of the audit, to allow
efficient scheduling and full participation of all parties.
The audit lead should emphasize that the completed questionnaire is to be returned within one (1) month
(or time frame deemed appropriate) of receipt. The information within the questionnaire is considered a
minimum, and both the Region and the monitoring organization under audit should feel free to include
additional information. Once the completed questionnaire has been received, it should be reviewed and
compared with the pertinent criteria and regulations. The AQS precision, bias and completeness data as
well as any other information on data quality can augment the documentation received from the reporting
organization under audit. This preliminary evaluation will be instrumental in selecting the sites to be
evaluated and in the decision on the extent of the monitoring site data audit. The audit team should then
prepare a checklist detailing specific points for discussion with monitoring organization personnel.
The audit team should be made up of several members to offer a wide variety of backgrounds and
expertise. This team may then divide into groups once on-site, so that both audit coverage and time
utilization can be optimized. A possible division may be that one group assesses the support laboratory
and headquarters operations while another evaluates sites, and subsequently assesses audit and calibration
information. The audit lead should confirm the proposed audit schedule with the audited organization
immediately prior to traveling to the site.
15.3.2 On-Site Activities
The audit team should meet initially with the audited monitoring organization's director or his/her
designee to discuss the scope, duration, and activities involved with the audit. This should be followed by
a meeting with key personnel
identified from the completed
questionnaire, or indicated by the
monitoring organization QAO. Key
personnel to be interviewed during
the audit are those individuals with
responsibilities for: planning, field
operations, laboratory operations,
QA/QC, data management and
reporting. At the conclusion of these
introductory meetings, the audit team
may begin work as two or more
independent groups, as illustrated in
Figure 15.3. To increase uniformity
of site inspections, it is suggested that
a site checklist be developed and
used. The format for Regional TSAs
can be found in Appendix H.
The importance of the audit of data
quality (ADQ) cannot be overstated.
Thus, sufficient time and effort should
be devoted to this activity so that the audit team has a clear understanding and complete documentation of
data flow. Its importance stems from the need to have documentation on the quality of ambient air
monitoring data for all the criteria pollutants for which the monitoring organization has monitoring
requirements. The ADQ will serve as an effective framework for organizing the extensive amount of
information gathered during the audit of laboratory, field monitoring and support functions within the
monitoring organization.
The entire audit team should prepare a brief written summary of findings, organized into the following
areas: planning, field operations, laboratory operations, quality assurance/quality control, data
management, and reporting. Problems with specific areas should be discussed and an attempt made to
rank them in order of their potential impact on data quality. For the more serious problems, audit findings
should be drafted (Fig. 15.4).
The audit finding form is designed such that one is filled out for each major deficiency that
requires formal corrective action. These forms inform the monitoring organization being audited about a
serious finding that may compromise the quality of the data and therefore require specific corrective
actions. They are initiated by the audit team and discussed at the debriefing.
During the debriefing discussion, evidence may be presented that reduces the significance of the finding,
in which case the finding may be removed. If the audited monitoring organization is in agreement with
the finding, the form is signed by the monitoring
organization's director or his/her designee during the
exit interview. If a disagreement occurs, the QA
Team should record the opinions of the monitoring
organization audited and set a time at some later date
to address the finding at issue.
The audit is now completed by having the audit team
members meet once again with key personnel, the
QAO and finally with the monitoring organization's
director to present their findings. This is also the
opportunity for the monitoring organization to
present their disagreements.
The audit team should simply state the audit results,
including an indication of the potential data quality
impact. During these meetings, the audit team
should also discuss the systems audit reporting
schedule and notify monitoring organization
personnel that they will be given a chance to
comment in writing, within a certain time period, on
the prepared audit report in advance of any formal
distribution.
15.3.3 Post-Audit Activities
The major post-audit activity is the preparation of the
systems audit report. The report will include:
- audit title, number and any other identifying information;
- audit team leaders, audit team participants and audited participants;
- background information about the project, purpose of the audit, dates of the audit, particular measurement phase or parameters that were audited, and a brief description of the audit process;
- summary and conclusions of the audit and corrective action requirements; and
- attachments or appendices that include all audit evaluations and audit finding forms.
To prepare the report, the audit team should meet
and compare observations with collected documents
and results of interviews and discussions with key
personnel. Expected QA project plan implementation
is compared with observed accomplishments and
deficiencies and the audit findings are reviewed in
detail. Within thirty (30)
calendar days of the completion
of the audit, the audit report
should be prepared and
submitted.
The TSA report is submitted to
the audited monitoring
organization. It is suggested
that a cover letter be used to
reiterate the fact that the audit
report is being provided for
review and written comment.
The letter should also indicate
that, should no written
comments be received by the
audit lead within thirty (30)
calendar days from the report
date, it will be assumed
acceptable to the monitoring
organization in its current form,
and will be formally distributed
without further changes.
If the monitoring organization
has written comments or
questions concerning the audit
report, the audit team should
review and incorporate them as
appropriate, and subsequently prepare and resubmit a report in final form within thirty (30) days of
receipt of the written comments. Copies of this report should be sent to the monitoring organization
director or his/her designee for internal distribution. The transmittal letter for the amended report should
indicate official distribution and again draw attention to the agreed-upon schedule for corrective action
implementation.
15.3.4 Follow-up and Corrective Action Requirements
As part of corrective action and follow-up, an audit finding response form (Fig 15.6) is generated by the
audited organization for each finding form submitted by the audit team. The audit finding response form
is signed by the audited organization's director and sent to the organization responsible for oversight,
which reviews and accepts the corrective action. The audit response form should be completed by the
audited organization within 30 days of acceptance of the audit report.
15.3.5 TSA Reporting to AQS
All 40 CFR Appendix A required TSAs will be reported to AQS. In 2013, a QA transaction was
developed to allow the reporting of 5 parameters: 1) the monitoring organization audited, 2) the auditing
agency, 3) the begin date and 4) the end date of the audit, and 5) the close-out date. The close-out date is
defined as the date when all corrective actions identified in the audit were completed.
Audit Finding Response Form
Audit Title: Audit #: Finding #:
Finding:
Cause of the problem:
Actions taken or planned for correction:
Responsibilities and timetable for the above actions:
Prepared by: Date:
Reviewed by: Date:
Remarks:
Is this audit finding closed? When?
File with official audit records. Send copy to auditee
Figure 15.6 Audit response form
15.4 Data Quality Assessments
A data quality assessment (DQA) is the statistical analysis of environmental data to determine whether
the quality of the data is adequate to support decisions based on the DQOs. Data are adequate if the
level of uncertainty in a decision based on the data is acceptable. The DQA process is described in
detail in the guidance document Data Quality Assessment: A Reviewer's Guide (EPA QA/G-9R), in
Section 18, and is summarized below.
1) Review the data quality objectives (DQOs) and sampling design of the program: review the DQOs,
or develop them if this has not already been done. Define the statistical hypothesis, tolerance limits,
and/or confidence intervals.
2) Conduct a preliminary data review: review QA data and other available QA reports, calculate
summary statistics, and develop plots/graphs. Look for patterns, relationships, or anomalies.
3) Select the statistical test: select the best test for analysis based on the preliminary review, and
identify underlying assumptions about the data for that test.
4) Verify test assumptions: decide whether the underlying assumptions made by the selected test hold
true for the data, and the consequences if they do not.
5) Perform the statistical test: perform the test and document inferences. Evaluate the performance for
future use.
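As a toy illustration of steps 2 through 5 (the values are made up, and the appropriate test depends on the assumptions verified in step 4), summary statistics and a one-sample t statistic for mean bias can be computed as:

    import statistics

    # Step 2: preliminary review -- summary statistics for a set of percent
    # differences from routine QC checks (illustrative values).
    pd_values = [-1.2, 0.8, 2.1, -0.5, 1.4, 0.9, -2.0, 1.1]
    mean = statistics.mean(pd_values)
    sd = statistics.stdev(pd_values)
    print(f"n={len(pd_values)}, mean={mean:.2f}%, sd={sd:.2f}%")

    # Steps 3-5: a one-sample t statistic for the hypothesis "mean bias = 0"
    # (assumes approximate normality; verify before trusting the inference).
    t = mean / (sd / len(pd_values) ** 0.5)
    print(f"t = {t:.2f}")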
EPA QA/G-9S, a companion document to EPA QA/G-9R, provides many appropriate statistical tests.
Both can be found on the EPA Quality Staff's Website13.
OAQPS plans on performing data quality assessments for the pollutants of the Ambient Air Quality
Monitoring Network at a yearly frequency for data reports and at a 3-year frequency for more
interpretive reports. Currently, EPA produces annual box-and-whisker plots of the gaseous pollutants,
titled Criteria Pollutant Quality Indicator Summary Reports, that are posted on AMTIC14, and develops
3-year QA reports15 for PM2.5. As more QA data become accessible and improvements are made in
reporting and assessment technologies, EPA hopes to develop a library of reports that users can run at
more frequent intervals. Monitoring organizations are encouraged to implement data quality assessments
for their data.
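The box-and-whisker summaries mentioned above reduce to a five-number summary per monitor or PQAO; a minimal sketch with made-up bias percentages:

    import statistics

    def five_number_summary(values):
        """Minimum, quartiles, and maximum -- the quantities a
        box-and-whisker plot displays."""
        q1, median, q3 = statistics.quantiles(values, n=4)
        return min(values), q1, median, q3, max(values)

    bias_pct = [-3.1, -1.2, -0.6, 0.4, 0.9, 1.1, 1.5, 2.2, 2.8, 4.0]
    print(five_number_summary(bias_pct))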
Data not meeting DQOs are not necessarily invalid, but those using the information for NAAQS
decisions or for other objectives have a higher probability of making an incorrect decision (declaring an
area in attainment when in truth it is in nonattainment, or declaring an area in nonattainment when in
truth it is in attainment). These types of errors can have serious financial and health risk consequences,
and monitoring organizations not meeting DQOs should make every effort to discover the reasons for the
measurement uncertainties in their monitoring networks. EPA Regions or the monitoring organization's
QA staff may want to revise TSA schedules based on the results from data quality assessments.
13 http://www.epa.gov/quality1/
14 http://www.epa.gov/ttn/amtic/qareport.html
15 http://www.epa.gov/ttn/amtic/anlqa.html
16.0 Reports to Management
This section provides guidance and suggestions to air monitoring organizations on how to report the
quality of the aerometric data, and how to convey information and requests for assistance concerning
quality control and quality assurance problems. The guidance offered here is primarily intended for
PQAOs that provide data to one or more of these national networks:
- SLAMS (State and Local Air Monitoring Stations)
- Tribal Monitoring Stations
- PAMS (Photochemical Assessment Monitoring Stations)
- PSD (Prevention of Significant Deterioration) stations
- NCore (National Core Monitoring Network)
- CSN (Chemical Speciation Network)
- NATTS (National Air Toxics Trend Sites)
This guidance may also be useful in preparing reports that summarize data quality of other pollutant
measurements such as those made at Special Purpose Monitoring Stations (SPMS) and state-specific
programs.
Several kinds of reports can be prepared. The size and frequency of the reports will depend on the
information requested or to be conveyed. A brief, corrective action form or letter-style report might ask
for attention to an urgent problem. On the other hand, an annual quality assurance report to management
would be a much larger report containing sections such as:
- executive summary,
- network background and present status,
- quality objectives for measurement data,
- quality assurance procedures,
- results of quality assurance activities,
- recommendations for further quality assurance work, and
- suggestions for improving performance, including items such as fixing equipment problems, personnel training needs, and infrastructure improvements.
A report to management should not solely consist of tabulations of analyzer-by-analyzer precision and
bias check results for criteria pollutants. This information is required to be submitted with the data each
quarter and is thus already available to management through AQS. Instead, the annual quality assurance
report to management should summarize and discuss the results of such checks. These summaries from
individual PQAOs can be incorporated into additional reports issued by the state, local, tribal and/or the
EPA Regional Office.
This section also provides general information for the preparation of reports to management and includes:
the types of reports that might be prepared, the general content of each type of report, and a
suggested frequency for their preparation,
sources of information that can be tapped to retrieve information for the reports, and
techniques and methods for concise and effective presentation of information.
Appendix I presents examples of two types of reports to management: the annual quality assurance report
to management and a corrective action request.
16.1 Guidelines for Preparation of Reports to Management
16.1.1 Types of QA Reports to Management
Listed in Table 16-1 are examples of typical QA reports to management. An individual reporting
organization may have others to add to the list or may create reports that are combinations of those listed
below.
Table 16-1 Types of QA Reports to Management

| Type of QA Report to Management | Contents | Suggested Reporting Frequency |
|---|---|---|
| Corrective action request | Description of problem; recommended action required; feedback on resolution of problem. | As required |
| Control chart with summary | Repetitive field or lab activity; control limits versus time. Prepare monthly or whenever new check or calibration samples are used. | Weekly, monthly, quarterly, and annually |
| National Performance Evaluation Program results | Summary of PEP, NPAP, NATTS PT and CSN audit results. | Quarterly and annually |
| State and local organization performance audits | Summary of audit results; recommendations for action, as needed. | Quarterly and annually |
| Technical systems audits | Summary of system audit results; recommendations for action, as needed. | As required and annually |
| Quality assurance report to management | Executive summary. Precision, bias, and system and performance audit results. | Quarterly and annually |
| Network reviews (by EPA Regional Office) | Review results and suggestions for actions, as needed. | Annually |
16.1.2 Sources of Information
Information for inclusion in the various reports to management may come from a variety of sources
including: records of precision and bias checks (AMP255 reports), results of systems and performance
audits, laboratory and field instrument maintenance logbooks, and NPAP audits. Table 16-2 lists useful
sources and the type of information expected to be found.
Table 16-2 Sources of Information for Preparing Reports to Management

| Information Source | Expected Information and Usefulness | Location |
|---|---|---|
| State implementation plan | Types of monitors, locations, and sampling schedule. | http://www.epa.gov/airquality/urbanair/sipstatus/overview.html |
| Annual Network Plans | Provides locations of networks and objectives of monitoring sites. | http://www.epa.gov/ttn/amtic/plans.html |
| Quality management plans and quality assurance project plans | Data quality indicators and goals for precision, bias, completeness, timeliness. | On file at monitoring organization and in most cases EPA Regional Offices. |
| Quality objectives for measurement data document | Quality objectives for measurement data. Audit procedures and frequency. | Most criteria pollutants posted in CFR; some under the criteria pollutant QA site: http://www.epa.gov/ttn/amtic/qapollutant.html |
| Laboratory and field instrument maintenance logbooks | Record of maintenance activity, synopsis of failures, recommendations for equipment overhaul or replacement. | Internal monitoring organization documents |
| Laboratory weighing room records of temperature, humidity | A record of whether environmental control in the weighing room is adequate to meet goals. | Internal monitoring organization documents |
| Audit results (NPAP, local, etc.) | Results of audit tests on ambient air pollutant measurement devices. | AQS database |
| Quality control data on local information management systems or AQS | Results are generally considered valid and can be used to determine achievement of data quality objectives. | AQS database |
16.1.3 Methods of Presenting Information
Reports to management are most effective when the information is given in a succinct, well-summarized
fashion. Methods useful for distilling and presenting information in ways that are easy to comprehend are
listed in Table 16-3. A 2008 guidance document designed to assist Tribes in developing monitoring
programs contains an expanded section (Section 7) that discusses many of the statistical techniques
described in Table 16-3 [1]. Several of these methods are available on-line in AirData [2]; others are
available in commercially available statistical and spreadsheet computer programs.
Table 16-3 Presentation Methods for Use in Reports to Management

| Presentation Method | Typical Use | Examples |
|---|---|---|
| Written text | Description of results and responses to problems | Appendix I |
| Control chart | Shows whether a repetitive process stays within QC limits. | Figure 10.4 of this Handbook |
| Black box report | Visually highlights information by color coding boxes to indicate where project goals, DQOs, etc. were/were not met | Executive Summary of Appendix I; three-year PM2.5 QA Reports on AMTIC |
| Bar charts | Show relationships between numerical values. | Included in most graphic and spreadsheet programs |
| X-Y (scatter) charts | Show relationships between two variables. | Included in most graphic and spreadsheet programs |
| Probability limit charts and box and whisker plots | Show a numerical value with its associated precision range. | Figure 1 of Appendix I |
16.1.4 Annual Quality Assurance Report
The annual quality assurance report (an example is provided in Appendix I) should consist of a number of
sections that describe the quality objectives for measurement data and how those objectives have been
met. A suggested organization might include:
Executive Summary of Report to Management - The executive summary should be a short section
(typically one or two pages) that summarizes the annual quality assurance report to management. It
should contain a checklist graphic that lets the reader know how the reporting organization has met
its goals for the report period. In addition, a short discussion of future needs and plans should be
included.
1. Technical Guidance for the Development of Tribal Monitoring Programs http://www.epa.gov/ttn/oarpg/t1/memoranda/techguidancetribalattch.pdf
2. http://www.epa.gov/airdata/
Introduction - This section describes the quality objectives for measurement data and serves as an
overview of the reporting organization's structure and functions. It also briefly describes the
procedures used by the reporting organization to assess the quality of field and laboratory
measurements.
Quality Information for each Ambient Air Pollutant Monitoring Program - These sections are
organized by ambient air pollutant category (e.g., gaseous criteria pollutants, air toxics). Each
section includes the following topics:
program overview and update
quality objectives for measurement data
data quality assessment
16.1.5 Corrective Action Request
A corrective action request should be made whenever anyone in the monitoring organization notes a
problem that demands either immediate or long-term action to correct a safety defect, or an operational
problem (either instrument malfunctions or procedural errors). A typical corrective action request form,
with example information entered, is shown in Appendix I. A separate form should be used for each
problem identified.
The corrective action report form is designed as a closed-loop system. First, it identifies the originator:
the person who reports and identifies the problem, states the problem, and may suggest a solution. The form
then directs the request to a specific person or persons (i.e., the recipient) who would be best qualified to
fix the problem. Finally, the form closes the loop by requiring that the recipient state how the problem
was resolved and the effectiveness of the solution. The form is signed, a copy is returned to the
originator, and other copies are sent to the supervisor and the applicable files for the record. The concepts
of the corrective action requests and form apply to either hardcopy or electronic processing of this
information. Laboratory/monitoring organization information management systems may be capable of
implementing this process.
17.0 Data Review, Verification and Validation
Data review, verification and validation are techniques used to accept, reject or qualify data in an
objective and consistent manner. Verification can be defined as confirmation, through provision of
objective evidence, that specified requirements have been fulfilled [1]. Validation can be defined as
confirmation, through provision of objective evidence, that the particular requirements for a specific
intended use are fulfilled. So, for example, one could verify that for a monitor all 1-point QC checks were
performed every two weeks (specified requirement) as described in standard operating procedures
(specified requirement). However, if the check results were higher than expected, the validation process
might determine that the data could not be used for NAAQS determinations (intended use). It is important
to describe the criteria for deciding the degree to which each data item has met its quality specifications,
as described in an organization's QAPP. This section will describe the techniques used to make these
assessments.
In general, these assessment activities are performed by persons implementing the environmental data
operations as well as by personnel independent of the operation, such as the organization's QA
personnel, and at some specified frequency. The procedures, personnel and frequency of the assessments
should be included in an organization's QAPP. These activities should occur prior to submitting data to
AQS and prior to final data quality assessments, which are discussed in Section 18.
Each of the following areas of discussion should be considered during the data
review/verification/validation processes. Some of the discussion applies to situations in which a sample
is collected and transported to a laboratory for analysis and data generation; others are applicable to
automated instruments. The following information is an excerpt from EPA QA/G-5 [2]:
Sampling Design - How closely a measurement represents the actual environment at a given time and
location is a complex issue that is considered during development of the sampling design. Each sample
should be checked for conformity to the specifications. By noting the deviations in sufficient detail,
subsequent data users will be able to determine the data's usability under scenarios different from those
included in project planning. Deviations should be noted on sample documentation (i.e., chain of custody
forms, field data forms or log books) in a manner conducive to subsequent data entry. For example,
development of a detailed set of data qualifiers (flags) makes data aggregation and assessment in
information management systems much easier and can help identify how often a qualifier is used and
whether the identified deviation has an effect on data quality.
Sample Collection Procedures- Details of how a sample is collected are important for properly
interpreting the measurement results. Sampling methods and field SOPs provide these details, which
include sampling and ancillary equipment and procedures (including equipment decontamination).
Acceptable departures (for example, alternate equipment) from the QAPP, and the action to be taken if
the requirements cannot be satisfied, should be specified for each critical criterion. Validation activities
should note potentially unacceptable departures from the QAPP. Comments or findings on deviations
from written sampling plans made during field technical systems audits or reviews should be noted.
Sample Handling- Details of how a sample is physically treated and handled during transportation to and
from the field site, through all laboratory handling stages prior to final analysis/reporting, are extremely
1. American National Standard: Quality Systems for Environmental Data and Technology Programs (ANSI/ASQ E4-2004) http://ansi.org/
2. EPA Guidance for Quality Assurance Project Plans (QA/G-5) http://www.epa.gov/quality1/qs-docs/g5-final.pdf
important. Correct interpretation of the subsequent measurement results requires that deviations from the
sample handling section of the QAPP/SOPs, and the actions taken to minimize or control the changes, be
detailed. Data collection SOPs should indicate events that occur during sample handling that may affect
the integrity of the samples. At a minimum, those responsible for reviewing/verifying/validating should
confirm that the sample containers and preservation methods are appropriate to the nature
of the sample and the type of data generated from the sample. Checks on the identity of the sample (e.g.,
proper labeling and chain of custody records) as well as proper physical/chemical storage conditions (e.g.,
chain of custody and storage records) should be made to ensure that the sample continues to be
representative of its native environment as it moves through the analytical process.
Analytical Procedures- Each sample should be verified to ensure that the procedures used to generate
the data were implemented as specified. Acceptance criteria should be developed for important
components of the procedures, along with suitable codes (qualifiers) for characterizing each sample's
deviation from the procedure. Data validation activities should determine how seriously a sample
deviated beyond the acceptable limit so that the potential effects of the deviation can be evaluated during
the DQA.
Quality Control- The quality control section of the QAPP specifies the QC checks that are to be
performed during sample collection, handling and analysis. These include analyses of check standards,
blanks and replicates, which provide indications of the quality of data being produced by specified
components of the measurement process. For each specified QC check, the procedure, acceptance
criteria, and corrective action (and changes) should be specified. Data validation should document the
corrective actions that were taken, which samples were affected, and the potential effect of the actions on
the validity of the data.
Calibration - Review the calibration of instruments and equipment and the information that should be
presented to ensure that the calibrations:
were performed before sampling began and at frequencies specified in the QAPP
were performed in the proper sequence (i.e., there may be a sequence of checks or other
implementation activities that must take place prior to calibration)
included the proper number of calibration points
were performed using standards that bracketed the range of reported measurement results
(otherwise, results falling outside the calibration range should be flagged as such)
had acceptable linearity checks and other checks to ensure that the measurement system was
stable when the calibration was performed
When calibration checks are found to be outside the acceptable limits prescribed in the QAPP, raw data
sampled between this calibration and the previous calibrations should be handled as described in the
QAPP. This could involve use of data flagging techniques for subsequent data evaluation.
Data Reduction and Processing - Data reduction/processing may be an irreversible process that involves
a loss of detail in the data and may involve averaging across time (e.g., 5-minute, hourly or daily averages)
or space (e.g., compositing results from samples thought to be physically equivalent). Since this
summarizing process produces a few values to represent a group of many data points, its validity should be
well-documented in the QAPP. One can take a subset of raw data and perform the data reduction process
by hand to verify that the reduction/processing techniques are performing as required in the QAPP and SOPs.
The information generation step involves the synthesis of the results of previous operations and the
construction of tables and charts suitable for use in reports. In many cases these types of reports are
generated on a frequent basis. A process should be developed that verifies that the reports are being
properly generated. This can include hand generating a subset of the report and reviewing and verifying
the programming code used to generate the reports.
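Where report generation is automated, the hand-verification described above can itself be partially scripted. The following sketch is illustrative only: it assumes raw 5-minute values and DAS hourly output are available as pandas DataFrames with hypothetical column names (timestamp, conc, hour, hourly_conc), and simply recomputes the hourly means and flags disagreements.

```python
# Hypothetical verification sketch; column names and tolerance are
# assumptions, not part of any EPA-specified system.
import pandas as pd

def verify_hourly_averages(raw_5min: pd.DataFrame, das_hourly: pd.DataFrame,
                           tolerance: float = 0.001) -> pd.DataFrame:
    """Recompute hourly means from raw 5-minute data and return the hours
    where the hand-computed value disagrees with the DAS-reported value."""
    recomputed = (raw_5min.set_index("timestamp")["conc"]
                  .resample("60min").mean()
                  .rename("recomputed"))
    merged = das_hourly.set_index("hour").join(recomputed)
    merged["discrepancy"] = (merged["hourly_conc"] - merged["recomputed"]).abs()
    return merged[merged["discrepancy"] > tolerance]
```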
17.1 Data Review Methods
The flow of data from the field environmental data operations to the storage in the database requires
several distinct and separate steps:
initial selection of hardware and software for the acquisition, storage, retrieval and transmittal of
data
organization and the control of the data flow from the field sites and the analytical laboratory
input and validation of the data
manipulation, analysis and archival of the data
submittal of the data into EPA's AQS database.
More details of information management systems are included in Section 14. Both manual and computer-
oriented systems require individual reviews of all data tabulations. As an individual scans tabulations,
there is no way to determine that all values are valid. The purpose of manual inspection is to spot
unusually high (or low) values (outliers) that might indicate a gross error in the data collection system.
Manual review of data tabulations also allows detection of uncorrected drift in the zero baseline of a
continuous sensor. Zero drift may be indicated when the daily minimum concentration tends to increase
or decrease from the norm over a period of several days. For example, at most sampling stations the early
morning (3:00 a.m. to 4:00 a.m.) concentrations of carbon monoxide tend to reach a minimum (e.g., 2 to 4
ppm). If the minimum concentration differs significantly from this, a zero drift may be suspected. Zero
drift could be confirmed by review of zero control chart information.
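The daily-minimum review described above lends itself to a simple automated screen. The sketch below is a minimal illustration, not an EPA-specified algorithm: the 2 to 4 ppm early-morning CO range from the text is used as a placeholder norm, and the seven-day rolling window is an assumption.

```python
# Illustrative zero-drift screen: flag periods where the rolling mean of
# daily minima leaves the expected range (placeholder limits, in ppm).
import pandas as pd

def daily_minimum_drift(hourly: pd.Series, window_days: int = 7,
                        norm_low: float = 2.0, norm_high: float = 4.0) -> pd.Series:
    """hourly: concentration series with a DatetimeIndex. Returns rolling
    means of daily minima outside the expected range, for follow-up against
    zero control chart information."""
    daily_min = hourly.resample("D").min()
    rolling = daily_min.rolling(window_days).mean()
    return rolling[(rolling < norm_low) | (rolling > norm_high)]
```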
In an automated data processing system, procedures for data validation can easily be incorporated into the
basic software. The computer can be programmed to scan data values for extreme values, outliers or
ranges. These checks can be further refined to account for time of day, time of week, and other cyclic
conditions. Questionable data values are then flagged to indicate a possible error. Other types of data
review can consist of preliminary evaluations of a set of data, calculating some basic statistical quantities,
and examining the data using graphical representations.
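One hedged illustration of such programmed screening appears below: values are flagged when they fall outside an hour-of-day range. The limits dictionary is entirely hypothetical; in practice each organization would derive its limits from site history, per its QAPP.

```python
# Sketch of a range check refined by time of day; limits are placeholders.
import pandas as pd

def range_screen(hourly: pd.Series, limits_by_hour: dict) -> pd.Series:
    """Return the values that fall outside the (low, high) limit assigned
    to their hour of day; hours without limits are not flagged."""
    flags = pd.Series(False, index=hourly.index)
    for ts, value in hourly.items():
        low, high = limits_by_hour.get(ts.hour, (float("-inf"), float("inf")))
        flags[ts] = not (low <= value <= high)
    return hourly[flags]
```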
DAS Data Review
The data review is an ongoing process that is performed by the station operators (SO) and the data
processing team (DP). At a minimum, a cursory review is performed daily, preferably in the morning, to
provide a status of the data and instrument performance at monitoring sites. Detailed analysis can be
extremely difficult for the data processing team when reviewing the raw data without the notations, notes
and calibration information that the station operators provide for the group. The typical review process
for the station operator and data reviewer(s) includes:
(SO) Review of zero, span, one-point QC verification information, the hourly data, and any flags
that could affect data, and record any information on the daily summaries that might be vital to
proper review of the data.
(SO) Transfer strip charts (both analog and digital information), daily summaries, monthly
maintenance sheets, graphic displays of metadata and site log notes to the central location for a
secondary and more thorough review.
(SO) At the central location, review the data, marking any notations of invalidations, and provide
electronic strip charts, metadata charts, daily summaries, site notes, and monthly maintenance
sheets for ready access by the data processing staff.
(DP) Review zero, span and one point QC verifications, station notes, and monthly maintenance
sheets for the month. Compare a defined number of hand reduced and/or strip chart readings to
electronic data points generated by the DAS. If significant differences are observed, determine
what corrective action is required.
Outliers
Outliers are measurements that are extremely large or small relative to the rest of the data and are
suspected of misrepresenting the population from which they were collected (EPA QA/G-9R) [3]. When
reviewing data, some potential outliers will be obvious, such as spikes in concentrations, data remaining
the same for hours, or a sudden drop in concentration that is still within the normal range of observed data.
Many of these outlier checks can be automated and provide efficient real-time checks of data. Outliers do
not necessarily indicate the data are invalid; they serve to alert the station operator and/or data reviewers
that there may be a problem. In fact, the rule of thumb for outliers should be that the data be considered
valid until there is an explanation for why the data should be invalidated. At some point it may be
necessary to exclude outliers from instantaneous reporting to the AIRNow network and/or AQI reporting
until further investigation has occurred. The EPA guidance documents [4] Guidance on Environmental
Data Verification and Data Validation (EPA QA/G-8) and Data Quality Assessment: A Reviewer's Guide
(EPA QA/G-9R) provide insight on outlier and data reviews in general.
In order to recognize that the reported concentration of a given pollutant is extreme, the individual must
have basic knowledge of the major pollutants and of air quality conditions prevalent at the reporting
station. Data values considered questionable should be flagged for verification. This scanning for
high/low values is sensitive to spurious extreme values but not to intermediate values that could also be
grossly in error. If possible, use of statistical techniques to identify data anomalies and outliers is
encouraged since they provide a more consistent evaluation. Some of these techniques and checks may
be incorporated into data logging systems as well as main office information management systems.
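As a concrete, hedged example of such a statistical technique, the interquartile-range rule below is one common screen in the spirit of the G-8/G-9R guidance; the 1.5 multiplier is a convention, not a regulatory threshold, and flagged values are candidates for review rather than automatic invalidation.

```python
# IQR-based outlier screen (conventional 1.5*IQR rule; parameters are not
# regulatory values). Returns a boolean mask of candidate outliers.
import numpy as np

def iqr_outliers(values: np.ndarray, k: float = 1.5) -> np.ndarray:
    q1, q3 = np.percentile(values, [25, 75])
    iqr = q3 - q1
    return (values < q1 - k * iqr) | (values > q3 + k * iqr)
```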
NOTE: During submission of data to AQS, a number of outlier and gap checks are performed
(see the outlier information above). The AQS website has documents describing these checks.
When an outlier is observed, a warning is generated and sent to the monitoring organization.
Monitoring organizations may ignore this warning and submit the data. However, it is
suggested that the data be reviewed. If the data are valid, a "V" qualifier can be added to the
data indicating the validity of the value. During automated annual data certification, any outlier
that does not have a "V" flag will be identified and will require the monitoring organization to
review the data and either invalidate the data point or add a "V" qualifier. Therefore, EPA
3. http://www.epa.gov/quality1/qs-docs/g9r-final.pdf
4. http://www.epa.gov/quality1/qa_docs.html
suggests that it is better to review and validate outliers during initial reporting rather than delay
the certification process.
17.2 Data Verification Methods
Verification can be defined as confirmation, through provision of objective evidence, that specified
requirements have been fulfilled [5]. The verification requirements for each data operation are included in
the organization's QAPP and in SOPs and should include not only the verification of sampling and
analysis processes but also operations like data entry, calculations and data reporting. The data
verification process involves the inspection, analysis, and acceptance of the field data or samples. These
inspections can take the form of technical systems audits (internal or external) or frequent inspections by
field operators and lab technicians. Questions that might be asked during the verification process include
but are not limited to:

5. Guidance on Environmental Data Verification and Data Validation (QA/G-8) http://www.epa.gov/quality/qa_docs.html
Were the environmental data operations performed according to the SOPs governing those
operations?
Were the environmental data operations performed at the time and date originally
specified? Many environmental operations must be performed within a specific time frame; for
example, the NAAQS samples for some particulates are collected once every six days from
midnight to midnight. The monitor timing mechanisms must have operated correctly for the
sample to be collected within the time frame specified.
Did the sampler or monitor perform correctly? Individual checks such as leak checks, flow
checks, meteorological influences, and all other assessments, audits, and performance checks
must have been acceptably performed and documented.
Did the environmental sample pass an initial visual inspection? Many environmental samples can
be flagged (qualified) during the initial visual inspection.
Have manual calculations, manual data entry, or human adjustments to software settings been
checked? Automated calculations should be verified and accepted prior to use, but at some
specified frequency these calculations should be reviewed to ensure that they have not changed.
17.3 Data Validation Methods
Data validation is a routine process designed to ensure that reported values meet the quality goals
of the environmental data operations. Data validation is further defined as confirmation, through
examination and provision of objective evidence, that the particular requirements for a specific intended
use are fulfilled. A progressive, systematic approach to data validation must be used to ensure and assess
the quality of data. Effective data validation procedures usually are handled completely
independently from the procedures of initial data collection.
Because the computer can perform computations and make comparisons extremely rapidly, it can
also make some determination concerning the validity of data values that are not necessarily high
or low. Data validation SOPs are needed to ensure the validation process is consistently followed
within a monitoring organization. For example, one can evaluate the difference between
QA Handbook Vol II, Section 17.0
Revision No: 0
Date: 05/13
Page 6 of 10
successive data values, since one would not normally expect very rapid changes in concentrations
of a pollutant during a 5-min or 1-h reporting period. When the difference between two
successive values exceeds a predetermined value, the tabulation can be flagged with an
appropriate symbol.
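A minimal sketch of this successive-difference check follows; the jump threshold is a placeholder that an organization would set per pollutant and reporting period in its validation SOP.

```python
# Flag observations whose change from the previous value exceeds max_jump;
# the first value has no predecessor and is never flagged.
import pandas as pd

def flag_rapid_changes(series: pd.Series, max_jump: float) -> pd.Series:
    jumps = series.diff().abs()
    return series[jumps > max_jump]
```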
Quality control data can support data validation procedures (see Section 17.3.3). If data
assessment results clearly indicate a serious response problem with the analyzer, the agency
should review all pertinent quality control information to determine whether any ambient data, as
well as any associated assessment data, should be invalidated. Therefore, if ambient data are
determined to be invalid, the associated precision, bias and accuracy readings related to the
routine data should not be reported to AQS [6]. Section 17.3.4 provides additional guidance on how
to handle QC data when routine data are invalidated. Any data quality calculations using the
invalidated readings should be redone. Also, the precision, bias or accuracy checks should be
rescheduled, preferably in the same calendar quarter. The basis or justification for all data
invalidations should be permanently documented.
Measurement quality objectives, based upon requirements in CFR, QAPPs and SOPs, in
combination with field and laboratory technical expertise, may be used to invalidate a sample or
measurement. Many organizations use flags or result qualifiers to identify potential problems
with data or a sample. Flags can indicate the reason that a data value (a) did not produce a
numeric result, (b) produced a numeric result but it is qualified in some respect relating to the
type or validity of the result, or (c) produced a numeric result but for administrative reasons is not
to be reported outside the organization. Flags can be used both in the field and in the laboratory
to signify data that may be suspect due to contamination, special events or failure of QC limits.
Flags can be used to determine if individual samples (data), or samples from a particular
instrument, will be invalidated. In all cases, the sample (data) should be thoroughly reviewed by
the organization and invalidated only for cause (i.e., objective evidence can be found that it does
not fulfill the requirements for its intended use).
Flags may be used alone or in combination to invalidate samples. Since the possible flag
combinations can be overwhelming and cannot always be anticipated, an organization needs to
review these flag combinations and determine if single values or values from a site for a
particular time period will be invalidated. The organization should keep a record of the
combination of flags that resulted in invalidating a sample or set of samples. These combinations
can be used to ensure that the organization evaluates and invalidates data in a consistent manner
and should be documented in the QAPP and updated as needed.
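One hypothetical way to keep such a record is sketched below: each flag combination maps to the decision it produced, so a later reviewer can check how the same combination was handled before. The flag codes shown are invented examples, not AQS qualifier codes.

```python
# Hypothetical decision log keyed by flag combination (order-independent).
from typing import Dict, FrozenSet, Optional

decision_log: Dict[FrozenSet[str], str] = {}

def record_decision(flags: FrozenSet[str], decision: str) -> None:
    decision_log[flags] = decision

def prior_decision(flags: FrozenSet[str]) -> Optional[str]:
    """Return the decision made the last time this combination was seen."""
    return decision_log.get(flags)

record_decision(frozenset({"CONTAM", "LOWFLOW"}), "invalidate")
print(prior_decision(frozenset({"LOWFLOW", "CONTAM"})))  # 'invalidate'
```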
Procedures for screening data for possible errors or anomalies should also be implemented. The
data quality assessment document series (EPA QA/G-9R [7], EPA QA/G-9S [8]) provides several
statistical screening procedures for ambient air quality data that should be applied to identify data
outliers.
6. See QA EYE Newsletter, Issue #13, page 6: http://www.epa.gov/ttn/amtic/files/ambient/qa/qanews13.pdf
7. Data Quality Assessment: A Reviewer's Guide http://www.epa.gov/quality1/qs-docs/g9r-final.pdf
8. Data Quality Assessment: Statistical Methods for Practitioners http://www.epa.gov/quality1/qs-docs/g9s-final.pdf
NOTE: It is strongly suggested that flags, specifically the appropriate null data code flags,
replace any routine values that are invalidated when reporting data to AQS. This provides an
indication to data users/assessors of the reasons why data that were expected to be collected
are missing. The actual data values and associated flags should remain in the monitoring
organization's local database.
17.3.1 Automated Methods
When zero, span or one-point QC checks exceed acceptance limits, ambient measurements should
be invalidated back to the most recent point in time where such measurements are known to be
valid. Usually this point is the previous check, unless some other point in time can be identified
and related to the probable cause of the excessive drift or exceedance (such as a power failure or
malfunction). Also, data following an analyzer malfunction or period of non-operation should be
regarded as invalid until the next subsequent acceptable check or calibration. Depending on the
sophistication of the DAS (see Section 14), monitoring organizations may have other automated
programs for data validation. These programs should be described in the monitoring
organization's approved QAPP prior to implementation. Even though the automated technique
may be considered acceptable, the raw invalidated data should be archived based on the statute of
limitations discussed in Section 5.
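The invalidation-window logic above can be expressed compactly; this sketch assumes checks are stored as (timestamp, passed) pairs and makes no claim about how a given DAS records them.

```python
# Determine the span of data to invalidate after a failed check: from the
# most recent passing check before the failure (None if there is none,
# e.g., after a malfunction) to the time of the failed check.
from datetime import datetime
from typing import List, Optional, Tuple

def invalidation_window(checks: List[Tuple[datetime, bool]],
                        failed_at: datetime) -> Tuple[Optional[datetime], datetime]:
    prior_passes = [t for t, ok in checks if ok and t < failed_at]
    start = max(prior_passes) if prior_passes else None
    return start, failed_at
```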
17.3.2 Manual Methods
For manual methods, the first level of data validation should be to accept or reject monitoring
data based upon results from operational checks selected to monitor the critical parameters in all
three major and distinct phases of manual methods--sampling, analysis, and data reduction. In
addition to using operational checks for data validation, observe all limitations, acceptance limits,
and warnings described in the reference and equivalent methods per se that may invalidate data. It
is further recommended that results from national performance evaluations required in 40 CFR
Part 58, Appendix A not be used as the sole criterion for data invalidation, because these checks
are performed fairly infrequently and not at every site, and doing so could result in a significant
invalidation of data depending on how the information was used. The performance evaluations
are used to provide an assessment of data comparability and bias at the PQAO level rather than an
evaluation of a particular monitor. So although a performance evaluation result might lead to a
question about the data quality of a particular monitor, it is expected that other quality control
data would also be used in the data validation process.
17.3.3 Validation Templates
In June 1998, a workgroup was formed to develop a procedure that could be used by monitoring
organizations to provide for a consistent validation of PM2.5 mass concentrations across
the US. The Workgroup developed three tables of criteria, where each table has a different degree
of implication about the quality of the data. The criteria included on the tables are from 40 CFR
Part 50, Appendices L and N, 40 CFR Part 58, Appendix A, Method 2.12, and a few criteria that
are neither in CFR nor Method 2.12.
One of the tables has the criteria that must be met to ensure the quality of the data. An example
criterion is that the average flow rate for the sampling period must be maintained to within 5% of
16.67 liters per minute. The second table has the criteria that indicate that there might be a
problem with the quality of the data and further investigation is warranted before making a
determination about the validity of the sample or samples. An example criterion is that the field
filter blanks should not change weight by more than 30 µg between weighings. The third table
has criteria that indicate a potentially systematic problem with the environmental data collection
activity. Such systematic problems may impact the ability to make decisions with the data. An
example criterion is that at least 75% of the scheduled samples for each quarter should be
successfully collected and validated.
To determine the appropriate table for each criterion, the members of the workgroup considered
how significantly the criteria impact the resulting PM2.5 mass. This was based on experience
from workgroup members, experience from non-workgroup members, and feasibility of
implementing the criterion.
Criteria that were deemed critical to maintaining the integrity of a sample or group of samples
were placed on the first table. Observations that do not meet each and every criterion on the
Critical Criteria Table should be invalidated unless there are compelling reasons and justification
for not doing so. Basically, the sample or group of samples for which one or more of these
criteria are not met is invalid until proven otherwise. The cause of not operating in the acceptable
range for each of the violated criteria must be investigated and minimized to reduce the likelihood
that additional samples will be invalidated.
Criteria that are important for maintaining and evaluating the quality of the data collection system
are included on the second table, the Operational Criteria Table. Violation of a criterion or a
number of criteria may be cause for invalidation. The decision should consider other quality
control information that may or may not indicate the data are acceptable for the parameter being
controlled. Therefore, the sample or group of samples for which one or more of these criteria are
not met is suspect unless other quality control information demonstrates otherwise. The reason
for not meeting the criteria MUST be investigated, mitigated or justified.
Finally, those criteria which are important for the correct interpretation of the data but do not
usually impact the validity of a sample or group of samples are included on the third table, the
Systematic Criteria Table. For example, the data quality objectives are included in this table. If
the data quality objectives are not met, this does not invalidate any of the samples but it may
impact the error rate associated with the attainment/non-attainment decision.
NOTE: The designations "Operational" and "Systematic" do not imply that these quality
control checks need not be performed. If an operational or systematic quality control
check that is required by regulation is not performed, it can be a basis for invalidation of all
associated data.
Based on the success and use of the PM2.5 validation template, the Workgroup embarked on the
development of similar templates for the remaining criteria pollutants. Appendix D provides
templates for each criteria pollutant. The validation templates are based on the current state of
knowledge at the time of development of the Handbook. However, the templates will also be
placed as a standalone document on AMTIC [9]. The user is directed to AMTIC to determine if any
changes have occurred to the template after the date of the Handbook revision. A table will be
updated with any changes occurring after the date of this Handbook.
The template will evolve as new information is discovered about the impact of the various
criteria on the error in the resulting concentration estimate. Interactions of the criteria, whether
synergistic or antagonistic, should also be incorporated when the impact of these interactions
becomes quantified. Due to the potential misuse of invalid data, data that are invalidated should
not be uploaded to AQS but should be retained in the monitoring organization's local database.
These data will be invaluable to the evolution of the validation template.
NOTE: Strict adherence to the validation templates is not required. They are meant to be a
guide based upon the knowledge of the Workgroup who developed them and may be a
starting point for monitoring organization-specific validation requirements.
17.3.4 Reporting QC Data Relative to Data Validation
The intent of the QC data that are reported to the AQS is to provide an estimate of precision and
bias of the routine data collected during a particular time period. For example, the 1-point QC
check is performed at a minimum every two weeks for the gaseous pollutants, and so the data from
the check indicate whether the monitor was within acceptance specifications for that time period.
Upon failure of the QC checks and subsequent invalidation of the data (should that occur) it is
expected that null value codes would replace the routine data and the QC check would not be
reported to AQS. Since the routine data would not be available it would not be appropriate to
provide a QC value that would be used in overall estimates of precision and bias of that site. The
estimate of precision and bias for that site should represent the valid routine data being reported
for the site.
It is suggested that only those QC checks that are performed on each monitor/sampler be subject
to removal, and only the checks within the same time period in which the routine data were
invalidated. As an example, if the annual PE for ozone was performed in April 2012 and the
ozone data for December 2012 were invalidated, the April 2012 PE could remain in AQS and only
the 1-point QC checks for December would be removed. Not all Appendix A checks fit nicely into
this paradigm. For example:
Collocated data - Since they represent a PQAO and not an individual site, it becomes more of a
dilemma. However, if routine data from a collocated site were invalidated due to a finding based
on imprecision of the collocated data, then one would not want to have these data represent the
other sites in the PQAO.
9. http://www.epa.gov/ttn/amtic/qalist.html
NPAP and PEP data - Similar to the collocated data, these data represent the PQAO and are not
often used to invalidate data. However, there are cases where NPAP data have been used in
concert with other data quality information that led to the invalidation of routine data, and in that
case it would not be appropriate to report the NPAP results to AQS.
Other concerns might arise in connection with the annual PEs, or audits mentioned above.
Consistent with many agencies' Quality Assurance Project Plans (QAPPs), data will not be
invalidated on the basis of an audit alone. Many agencies will verify, such as by independent
tests, the results of a failed audit. It might not be practical in all cases to verify an audit result,
immediately recalibrate the failed channel and schedule a second audit following the
recalibration. Accordingly, excluding the audit result that discovered a problem in the first place
could cause the responsible agency either to incur additional audit costs or, alternatively, be
penalized for appearing to fail to meet the required number of audits. Many agencies would be
concerned about having a less than complete audit count appear in the AMP255 at the time of
annual data certification.
As suggested above, monitoring agencies should keep in mind the objective of reporting the
results of QA and QC checks to AQS: representing the precision and bias of the reported raw
data. The analysts who report these data should be mindful that precision and bias calculations
can apply at the monitor level or at the PQAO level. Often, a result that falls outside criteria
indicates an out-of-control situation that is subsequently corrected such as by invalidating data
and recalibrating. Under other circumstances, after-the-fact review of QC checks with poor, but
passing, results might reveal a trend consistent with a problem that was only discovered by
some other means.
Because of concerns such as these, it is important to consider these recommendations in the
context of corrective action. It is recommended that QAPPs include wording that addresses when
to retain and when to exclude QA and QC data from AQS and when to conduct replacement
QA/QC checks. However, it is impossible to foresee every circumstance that might lead to a poor
QA/QC result and, in some cases, it might not be obvious whether to report or exclude a result. In
these cases decisions may fall to the responsible QA officers or managers. Discussions between
the EPA Region and monitoring organizations might also need to occur to determine the best
course of action.
18.0 Reconciliation with Data Quality Objectives
Section 3 described the data quality objective (DQO) process, which is an important planning tool to
determine the objectives of an environmental data operation, to understand and agree upon the allowable
uncertainty in the data and, with that, to optimize the sampling design. This information, along with
sampling and analytical methods and appropriate QA/QC, should be documented in an organization's
QAPP. The QAPP is then implemented by the monitoring organizations under the premise that if it is
followed, the DQOs should be met. Reconciliation with the DQO involves reviewing both routine and
QA/QC data to determine whether the DQOs have been attained and that the data are adequate for their
intended use. This process of evaluating the data against the DQOs has been termed data quality
assessment (DQA).
The DQA process has been developed for cases where formal DQOs have been established. These
procedures can also be used for data that do not have formal DQOs, although some idea of the decisions
that will be made with the data is needed. Guidance on the DQA process can be found in the document
titled Data Quality Assessment: A Reviewer's Guide (EPA QA/G-9R) [1]. It has a companion document,
Data Quality Assessment: Statistical Methods for Practitioners (EPA QA/G-9S) [2], that focuses on
evaluating data for fitness in decision-making and also provides many graphical and statistical tools.
As stated in EPA QA/G-9R, "Data quality, as a concept, is meaningful only when it relates to the intended
use of the data." By using the DQA process, one can answer four fundamental questions:
1. Can the decision (or estimate) be made with the desired level of certainty, given the quality of the
data set?
2. How well did the sampling design perform?
3. If the same sampling design strategy is used again for a similar study, would the data be expected
to support the same intended use with the desired level of uncertainty?
4. Is it likely that sufficient samples were taken to enable the reviewer to see an effect if it was
really present?
The DQA is a key part of the assessment phase of the data life cycle (Figure 18.1), which is very similar
to the ambient air QA life cycle described in Section 1. As the part of the assessment phase that follows
data validation and verification, DQA determines how well the validated data can support their intended
use.
It is realized that some monitoring organizations may not have the statistical support available to use the
formal DQA process described below. The information below is intended to provide a good example of
the steps that would be followed for a formal DQA for those capable and interested in the approach. EPA,
through the development of the criteria pollutant DQOs and the assessments it produces through 3-year
QA reports, AQS AMP reports, and annual box and whisker plots, attempts to provide information to
assist monitoring organizations in their data quality assessments. In addition, there are many software
packages available that can generate the statistics mentioned in the following DQA steps, and there are a
number of internet sites that can be searched to learn how to use these statistics. Some additional
guidance that can be used to help evaluate data is provided after the five-step process.
1. http://www.epa.gov/quality1/qs-docs/g9r-final.pdf
2. http://www.epa.gov/quality1/qs-docs/g9s-final.pdf
18.1 Five Steps of the DQA Process
As described in EPA QA/G-9R [1] and EPA QA/G-9S [2], the DQA process is comprised of five steps,
which are detailed below. Since DQOs are available for the PM2.5 program, they will be used as an
example of the type of information that might be considered in each step. The PM2.5 information is
italicized and comes from a model PM2.5 QAPP [3] for a fictitious PQAO called Palookaville. The model
QAPP was developed to help monitoring organizations develop QAPPs based upon the R-5 QAPP
requirements. Most of the information that follows is provided verbatim from the Model QAPP.
However, notes have been added where updates, relative to the date of this Handbook, are needed.
The DQA discussed below is based on a 3-year assessment. The PM2.5 DQOs were developed with goals
for a 3-year precision estimate of 10 percent coefficient of variation and a 3-year bias estimate of ±10
percent. Some steps below may seem inefficient since monitoring organizations evaluate QC data on a
more frequent basis than every three years. However, the example below is used relative to the
achievement of the 3-year PM2.5 DQO.
Figure 18.1 DQA in the context of the data life cycle.
Step 1. Review DQOs and Sampling Design. Review the DQO outputs to assure that they are still
applicable. If DQOs have not been developed, specify DQOs before evaluating the data (e.g., for
environmental decisions, define the statistical hypothesis and specify tolerable limits on decision errors;
for estimation problems, define an acceptable confidence or probability interval width). Review the
sampling design and data collection documentation for consistency with the DQOs, noting any
potential discrepancies.

3. http://www.epa.gov/ttn/amtic/pmqa.html
The PM2.5 DQOs define the primary objective of the PM2.5 ambient air monitoring network (PM2.5 NAAQS
comparison), translate the objective into a statistical hypothesis (3-year average of annual mean PM2.5
concentrations less than or equal to 15 µg/m³ and 3-year average of annual 98th percentiles of the PM2.5
concentrations less than or equal to 35 µg/m³), and identify limits on the decision errors (incorrectly
conclude the area is in non-attainment when it truly is in attainment no more than 5% of the time, and
incorrectly conclude the area is in attainment when it truly is in non-attainment no more than 5% of the time).
The CFR contains the details for the sampling design, including the rationale for the design, the design
assumptions, and the sampling locations and frequency. If any deviations from the sampling design have
occurred, these will be indicated and their potential effect carefully considered throughout the entire
DQA.
NOTE: CFR now requires an annual air monitoring network plan [4] that may be helpful in the
evaluation of this step.
Step 2. Conduct Preliminary Data Review. Review QA reports, calculate basic statistics, and generate
graphs of data. Use this information to understand the structure of the data and identify patterns,
relationships, or potential anomalies.
A preliminary data review will be performed to uncover potential limitations of using the data, to reveal
outliers, and generally to explore the basic structure of the data. The first step is to review the quality
assurance reports [5]. The second step is to calculate basic summary statistics, generate graphical
presentations of the data, and review these summary statistics and graphs.
Review Quality Assurance Reports. Palookaville will review all relevant quality assurance reports that
describe the data collection and reporting process. Particular attention will be directed to looking for
anomalies in recorded data, missing values, and any deviations from standard operating procedures.
This is a qualitative review. However, any concerns will be further investigated in the next two steps.
Calculation of Summary Statistics and Generation of Graphical Presentations. Palookaville will
generate pertinent summary statistics for each of its primary and QA samplers. These summary
statistics will be calculated at the quarterly, annual, and three-year levels and will include only valid
samples. The summary statistics are:
Number of samples, mean concentration, median concentration, standard deviation, coefficient of
variation, maximum concentration, minimum concentration, interquartile range, skewness and
kurtosis.
These statistics will also be calculated for the percent differences at the collocated sites. The results will
be summarized in a table. Particular attention will be given to the impact on the statistics caused by the
observations noted in the quality assurance review. For example, Palookaville may evaluate the
influence of a potential outlier by evaluating the change in the summary statistics resulting from
exclusion of the outlier.

4. Monitoring plans can be found on AMTIC at http://www.epa.gov/ttn/amtic/plans.html
5. At the writing of the Handbook, the AQS system produces the AMP255 Data Quality Indicator report, which is the primary report for the assessment of quality assurance data for criteria pollutants.
Palookaville will generate graphics to present the results from the summary statistics and show the
spatial continuity over the sample areas. Maps will be created for the annual and three-year means,
maxima, and interquartile ranges for a total of 6 maps. The maps will help uncover potential outliers and
will help in the network design review. Additionally, basic histograms will be generated for each of the
primary and QA samplers and for the percent difference at the collocated sites. The histograms will be
useful in identifying anomalies and evaluating the normality assumption in the measurement errors.
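The summary statistics named above are straightforward to compute; the sketch below uses pandas and is offered only as an illustration of the Step 2 calculations, with sample skewness and kurtosis as pandas defines them (the Model QAPP does not specify the exact estimators).

```python
# Step 2 summary statistics for one sampler's valid concentrations.
import pandas as pd

def summary_stats(conc: pd.Series) -> pd.Series:
    return pd.Series({
        "n": conc.count(),
        "mean": conc.mean(),
        "median": conc.median(),
        "std": conc.std(),
        "cv_percent": 100 * conc.std() / conc.mean(),
        "max": conc.max(),
        "min": conc.min(),
        "iqr": conc.quantile(0.75) - conc.quantile(0.25),
        "skewness": conc.skew(),
        "kurtosis": conc.kurt(),
    })
```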
Step 3. Select the Statistical Test. Select the most appropriate procedure for summarizing and
analyzing the data based upon the reviews of the performance and acceptance criteria associated with the
DQOs, the sampling design, and the preliminary data review. Identify the key underlying assumptions
that must hold for the statistical procedures to be valid.
The primary objective for the PM2.5 mass monitoring is determining compliance with the PM2.5 NAAQS.
As a result, the null and alternative hypotheses are:
H0: X ≤ 15 µg/m³ and Y ≤ 35 µg/m³
HA: X > 15 µg/m³ or Y > 35 µg/m³
where X is the three-year average PM2.5 concentration and Y is the three-year average of the annual 98th
percentiles of the PM2.5 concentrations recorded for an individual monitor. The exact calculations for X
and Y are specified in 40 CFR Part 50, Appendix N. The null hypothesis is rejected (that is, it is
concluded that the area is not in compliance with the PM2.5 NAAQS) when the observed three-year
average of the annual arithmetic mean concentrations exceeds 15.05 µg/m³ or when the observed
three-year average of the annual 98th percentiles exceeds 35.5 µg/m³. If the bias of the sampler is within
±10% and the precision is within 10%, then the error rates (Type I and Type II) associated with this
statistical test are less than or equal to 5%. The definitions of bias and precision will be outlined in the
following step.
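The comparison itself reduces to two inequalities. The sketch below mirrors the rounded thresholds quoted in this example (15.05 and 35.5 µg/m³); X and Y are the design values computed per 40 CFR Part 50, Appendix N, which is not reproduced here.

```python
# Reject H0 (conclude non-compliance) when either design value exceeds its
# threshold; thresholds are those quoted in this example.
def reject_null(x_annual_3yr: float, y_98th_3yr: float) -> bool:
    return x_annual_3yr > 15.05 or y_98th_3yr > 35.5
```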
Step 4. Verify Assumptions of Statistical Test. Evaluate whether the underlying assumptions hold, or
whether departures are acceptable, given the actual data and other information about the study.
The assumptions behind the statistical test include those associated with the development of the DQOs in
addition to the bias and precision assumptions. The method of verification will be addressed in this step.
Note that when less than three years of data are available, this verification will be based on as much data
as are available.
The DQO is based on the annual arithmetic mean NAAQS. For each primary sampler, Palookaville
will determine which, if either, of the PM2.5 NAAQS concentrations is violated. In the DQO development,
it was assumed that the annual standard is more restrictive than the 24-hour standard. If there are any
samplers that violate ONLY the 24-hour NAAQS, then this assumption is not correct. The seriousness of
violating this assumption is not clear. Conceptually, the DQOs can be developed based on the 24-hour
NAAQS and the more restrictive bias and precision limits selected. However, Palookaville will assume
the annual standard is more restrictive, until proven otherwise.
Normal distribution for measurement error. Assuming that measurement errors are normally
distributed is common in environmental monitoring. Palookaville has not investigated the sensitivity of
the statistical test to violations of this assumption, although small departures from normality generally do
not create serious problems. Instead, Palookaville will evaluate the reasonableness of the normality
assumption by reviewing a normal probability plot and calculating the Shapiro-Wilk W test statistic (if
the sample size is less than 50) or the Kolmogorov-Smirnov test statistic (if the sample size is 50 or
greater). All three techniques are provided by standard statistical packages. If the plot or statistics
indicate possible violations of normality, Palookaville may need to determine the sensitivity of the DQOs
to departures from normality.
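Both tests are available in standard packages; for example, scipy provides them, as in the hedged sketch below. Note that using the Kolmogorov-Smirnov test with parameters estimated from the same data is an approximation, shown only to mirror the test selection in the text.

```python
# Normality screen following the sample-size rule of thumb above.
import numpy as np
from scipy import stats

def normality_pvalue(x: np.ndarray) -> float:
    if len(x) < 50:
        return stats.shapiro(x).pvalue  # Shapiro-Wilk for smaller samples
    # K-S against a normal fitted to the data (approximate when parameters
    # are estimated from the same sample).
    return stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1))).pvalue
```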
Decision error can occur when the estimated 3-year average differs from the actual (true) 3-year
average. This is not really an assumption as much as a statement that the data collected by an ambient
air monitor is stochastic, meaning that there are errors in the measurement process, as mentioned in the
previous assumption.
The limits on precision and bias are based on the smallest number of required sample values in a 3-year
period. In the development of the DQOs, the smallest number of required samples was used. The reason
for this was to ensure that the confidence was sufficient in the minimal case; if more samples are
collected, then the confidence in the resulting decision will be even higher. For each of the samplers,
Palookaville will determine how many samples were collected in each quarter. If this number meets or
exceeds 12, then the data completeness requirements for the DQO are met.
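A completeness tally of this kind is a one-liner in most environments; the sketch below groups valid sample dates by calendar quarter and applies the 12-sample floor described above.

```python
# Quarterly completeness check: True where a quarter has >= 12 valid samples.
import pandas as pd

def quarters_meeting_completeness(sample_dates: pd.Series, min_n: int = 12) -> pd.Series:
    counts = sample_dates.groupby(sample_dates.dt.to_period("Q")).size()
    return counts >= min_n
```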
The decision error limits were set at 5%. If the other assumptions are met, then the decision error limits
are less than or equal to 5%.
Measurement imprecision was established at 10% coefficient of variation (CV). For each sampler,
Palookaville will review the coefficient of variation calculated in Step 2. If any exceed 10%, Palookaville
may need to determine the sensitivity of the DQOs to larger levels of measurement imprecision.
Table 18-1 will be completed during each DQA. The table summarizes which, if any, assumptions have
been violated. A check will be placed in each of the row/column combinations that apply. Ideally, there
will be no checks. However, if there are checks in the table, the implication is that the decision error
rates are unknown, even if the bias and precision limits are achieved. As mentioned above, if any of the
DQO assumptions are violated, then Palookaville will need to reevaluate its DQOs.
Achievement of bias and precision limits. Lastly, Palookaville will check the assumption that, at the
3-year level of aggregation, the sampler bias is within ±10% and precision is within 10%. The data from the
collocated samplers will be used to calculate quarterly, annual, and 3-year bias and precision estimates,
even though it is only the 3-year estimates that are critical for the statistical test.
Since all the initial samplers being deployed by Palookaville will be FRMs, the samplers at each of the
collocated sites will be identical method designations. As such, it is difficult to determine which of the
collocated samplers is closer to the true PM
2.5
concentration. Palookaville will calculate an estimate of
precision. A bias measure will also be calculated, but it can only describe the relative difference of one
sampler to the other, not definitively indicate which sampler is closer to the true value. The following
paragraphs contain the algorithms for calculating precision and bias. These are similar, but differ
slightly, from the equations in 40 CFR Part 58, Appendix A.
Table 18-1 Summary of Violations of DQO Assumptions

Site | Violate 24-Hour Standard ONLY? | Measurement Errors Non-Normal? | Data Complete? (≥ 12 samples per quarter) | Measurement CV > 10%?
Primary Samplers:
A1 | | | |
A2 | | | |
A3 | | | |
A4 | | | |
B1 | | | |
QA Samplers:
A1 | | | |
B1 | | | |
Before describing the algorithm, some groundwork is necessary. When less than three years of
collocated data are available, the three-year bias and precision estimates must be predicted.
Palookaville's strategy for accomplishing this will be to use all available quarters of data as the basis for
projecting where the bias and precision estimates will be at the end of the three-year monitoring period.
Three-year point estimates will be computed by weighting the quarterly components, using the most
applicable of the following assumptions:
1. The most recent quarter's precision and bias are most representative of what the future quarters will
be.
2. All previous quarters' precision and bias are equally representative of what the future quarters
will be.
3. Something unusual happened in the most recent quarter, so the most representative quarters are
all the previous ones, minus the most recent.
Each of these scenarios results in weights that will be used in the following algorithms. The weights are
shown in Table 18-2 where the variable Q represents the number of quarters for which observed bias and
precision estimates are available. Note that when Q=12, that is, when there are bias and precision
values for all of the quarters in the three-year period, then all of the following scenarios result in the
same weighting scheme.
Table 18-2 Weights for Estimating Three-Year Bias and Precision

Scenario | Assumption | Weights
1 | Latest quarter most representative | w_q = 12 − (Q−1) for the latest quarter; w_q = 1 otherwise
2 | All quarters equally representative | w_q = 12/Q for each quarter
3 | Latest quarter unrepresentative | w_q = 1 for the latest quarter; w_q = 11/(Q−1) otherwise
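For illustration, the Table 18-2 weights can be written out as a short function (a sketch; the convention that weights are returned oldest quarter first is an assumption made here for concreteness):

    def quarter_weights(Q, scenario):
        """Weights from Table 18-2 for Q observed quarters (oldest first)."""
        if scenario == 1:                 # latest quarter most representative
            return [1.0] * (Q - 1) + [12.0 - (Q - 1)]
        if scenario == 2:                 # all quarters equally representative
            return [12.0 / Q] * Q
        if scenario == 3:                 # latest quarter unrepresentative
            # needs Q >= 2 (there must be a quarter besides the latest)
            return [11.0 / (Q - 1)] * (Q - 1) + [1.0]
        raise ValueError("scenario must be 1, 2, or 3")

    # In every scenario the weights sum to 12, and when Q = 12 all three
    # scenarios reduce to w_q = 1 for each quarter, matching the note above.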
In addition to point estimates, Palookaville will develop confidence intervals for the bias and precision
estimates. This will be accomplished using a re-sampling technique. The protocol for creating the
confidence intervals is outlined in Box 18.1.
QA Handbook Vol II, Section 18.0
Revision No: 0
Date: 5/13
Page 7 of 11
The algorithms for determining whether the bias and precision DQOs have been achieved for each
sampler follow:
Bias Algorithm

1. For each measurement pair, estimate the percent relative bias, d_i:

   d_i = 100% × (Y_i − X_i) / [ (X_i + Y_i) / 2 ]

where X_i represents the concentration recorded by the primary sampler and Y_i represents the
concentration recorded by the collocated sampler.
2. Summarize the percent relative bias to the quarterly level, D_j,q, according to

   D_j,q = (1 / n_j,q) Σ_{i=1..n_j,q} d_i

where n_j,q is the number of collocated pairs in quarter q for site j.
Box 18.1 Method for Estimating Confidence in Achieving Bias and Precision DQOs

Let Z be the statistic of interest (bias or precision). For a given weighting scenario, the re-sampling will be
implemented as follows:

1. Determine M, the number of collocated pairs per quarter for the remaining 12−Q quarters (default is
   M=15, or use M = the average number observed for the previous Q quarters).
2. Randomly select, with replacement, M collocated pairs per quarter for each of the future 12−Q quarters in a
   manner consistent with the given weighting scenario:
   Scenario 1: Select pairs from the latest quarter only.
   Scenario 2: Select pairs from any quarter.
   Scenario 3: Select pairs from any quarter except the latest one.
   The result of this step is a complete collocated data set for a three-year period, from which bias and
   precision estimates can be determined.
3. Based on the filled-out three-year period from step 2, calculate the three-year bias and precision estimates,
   using Equations 18-1 and 18-2 with w_q = 1 for each quarter.
4. Repeat steps 2 and 3 numerous times, such as 1,000 times.
5. Determine P, the fraction of the 1,000 simulations for which the three-year bias and precision criteria are
   met. P is interpreted as the probability that the sampler is generating observations consistent with the
   three-year bias and precision DQOs.
3. Summarize the quarterly bias estimates to the three-year level using

   D_j = [ Σ_{q=1..n_q} w_q D_j,q ] / [ Σ_{q=1..n_q} w_q ]        Equation 18-1

where n_q is the number of quarters with actual collocated data and w_q is the weight for quarter q
as specified by the scenario in Table 18-2.
4. Examine D_j,q to determine whether one sampler is consistently measuring above or below the
other. To formally test this, a non-parametric test (the Wilcoxon Signed Rank Test) will be used,
as described in EPA QA/G-9S². If the null hypothesis is rejected, then one of the samplers
is consistently measuring above or below the other. This information may be helpful in directing
the investigation into the cause of the bias.
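A minimal sketch of steps 1 through 4 in Python follows. The data layout (a list of (X_i, Y_i) tuples per quarter) is an assumption made for illustration; the Wilcoxon test uses scipy.stats.wilcoxon, consistent with the test named in step 4.

    import numpy as np
    from scipy import stats

    def pair_bias(x, y):
        """Step 1: percent relative bias d_i for one collocated pair."""
        return 100.0 * (y - x) / ((x + y) / 2.0)

    def quarterly_bias(pairs_q):
        """Step 2: D_j,q, the mean d_i over the pairs in one quarter."""
        return np.mean([pair_bias(x, y) for x, y in pairs_q])

    def three_year_bias(quarterly, weights):
        """Step 3 (Equation 18-1): weighted mean of the quarterly biases."""
        w = np.asarray(weights, dtype=float)
        return np.dot(w, np.asarray(quarterly, dtype=float)) / w.sum()

    def consistent_one_sided(all_pairs, alpha=0.05):
        """Step 4: Wilcoxon signed rank test on the relative differences."""
        d = [pair_bias(x, y) for x, y in all_pairs]
        _, p = stats.wilcoxon(d)
        return p < alpha   # True -> one sampler consistently reads higher/lower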
Precision Algorithm

1. For each measurement pair, calculate the coefficient of variation, cv_i:

   cv_i = |d_i| / √2
2. Summarize the coefficient of variation to the quarterly level, CV_j,q, according to

   CV_j,q = sqrt( (1 / n_j,q) Σ_{i=1..n_j,q} (cv_i)² )

where n_j,q is the number of collocated pairs in quarter q for site j.
3. Summarize the quarterly precision estimates to the three-year level using

   CV_j = sqrt( [ Σ_{q=1..n_q} w_q (CV_j,q)² ] / [ Σ_{q=1..n_q} w_q ] )        Equation 18-2

where n_q is the number of quarters with actual collocated data and w_q is the weight for quarter q
as specified by the scenario in Table 18-2.
4. If the null hypothesis in the Wilcoxon Signed Rank Test was not rejected, then the coefficient of
variation can be interpreted as a measure of precision. If the null hypothesis in the Wilcoxon
Signed Rank Test was rejected, the coefficient of variation has both a component representing
precision and a component representing the (squared) bias.
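The corresponding precision calculation can be sketched the same way (again an illustration under the same assumed data layout, not the CFR reference implementation; d_i is recomputed inline from its definition in the bias algorithm):

    import numpy as np

    def pair_cv(x, y):
        """Step 1: per-pair coefficient of variation, |d_i| / sqrt(2)."""
        d = 100.0 * (y - x) / ((x + y) / 2.0)   # d_i from the bias algorithm
        return abs(d) / np.sqrt(2.0)

    def quarterly_cv(pairs_q):
        """Step 2: CV_j,q, the root mean square of the per-pair CVs."""
        return np.sqrt(np.mean([pair_cv(x, y) ** 2 for x, y in pairs_q]))

    def three_year_cv(quarterly, weights):
        """Step 3 (Equation 18-2): weighted RMS of the quarterly CVs."""
        w = np.asarray(weights, dtype=float)
        cv = np.asarray(quarterly, dtype=float)
        return np.sqrt(np.dot(w, cv ** 2) / w.sum())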
Confidence in Bias and Precision Estimates
1. Follow the method described in Box 18.1 to estimate the probability that the sampler is
generating observations consistent with the three-year bias and precision DQOs. The
re-sampling must be done for each collocated site.
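A sketch of the Box 18.1 re-sampling protocol follows. The per-quarter data layout, the default M = 15, and the `criterion` callback are illustrative assumptions; `criterion` stands in for the three-year bias and precision check computed with Equations 18-1 and 18-2 at w_q = 1.

    import random

    def resample_probability(quarters, scenario, criterion, M=15, reps=1000):
        """Estimate P, the fraction of simulations meeting the DQO criterion.

        quarters : list of lists, one list of collocated-pair statistics
                   (e.g., the d_i values) per observed quarter, oldest first.
        scenario : 1, 2, or 3, as in Table 18-2.
        criterion: function mapping a filled-out 12-quarter data set to
                   True if the three-year bias/precision criteria are met.
        """
        Q = len(quarters)
        if scenario == 1:
            pool = quarters[-1]                            # latest quarter only
        elif scenario == 2:
            pool = [d for q in quarters for d in q]        # any quarter
        else:
            pool = [d for q in quarters[:-1] for d in q]   # all but latest
        met = 0
        for _ in range(reps):
            # draw M pairs with replacement for each of the 12-Q future quarters
            future = [random.choices(pool, k=M) for _ in range(12 - Q)]
            met += criterion(quarters + future)
        return met / reps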
Summary of Bias and Precision Estimation
The results from the calculations and re-sampling will be summarized in Table 18-3. There will be one
line for each site operating a collocated sampler.
Table 18-3 Summary of Bias and Precision

Collocated Site | Three-Year Bias Estimate (Equation 18-1) | Three-Year Precision Estimate (Equation 18-2) | Null Hypothesis of Wilcoxon Test Rejected? | P (Box 18.1)
A1 | | | |
B1 | | | |
Step 5. Draw Conclusions from the Data. Perform the calculations required for the statistical test and
document the inferences drawn as a result of these calculations. If the design is to be used again, evaluate
the performance of the sampling design.
Before determining whether the monitored data indicate compliance with the PM2.5 NAAQS, Palookaville
must first determine whether any of the assumptions upon which the statistical test is based are violated. This
can be easily checked in Step 5 because of all the work done in Step 4. In particular, as long as
- in Table 18-1, there are no checks, and
- in Table 18-3,
  o the three-year bias estimate is in the interval [-10%, 10%], and
  o the three-year precision estimate is less than or equal to 10%,
then the assumptions underlying the test appear to be valid. As a result, if the observed three-year
average PM2.5 concentration is less than 15 µg/m³ and the observed three-year average 98th percentile is
less than 35 µg/m³, the conclusion is that the area appears to be in compliance with the PM2.5 NAAQS, with
an error rate of 5%.
If any of the assumptions have been violated, then the level of confidence associated with the test is
suspect and will have to be further investigated.
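As a compact restatement of this decision rule (the variable names are hypothetical; the thresholds are the ones cited in the text above):

    def naaqs_conclusion(table_18_1_checks, bias_3yr, precision_3yr,
                         annual_avg_3yr, pct98_avg_3yr):
        """Sketch of the Step 5 decision logic described in the text."""
        assumptions_ok = (not any(table_18_1_checks)     # no checks in Table 18-1
                          and -10.0 <= bias_3yr <= 10.0  # Table 18-3 bias limit
                          and precision_3yr <= 10.0)     # Table 18-3 precision limit
        if not assumptions_ok:
            return "confidence in the test is suspect; investigate further"
        if annual_avg_3yr < 15.0 and pct98_avg_3yr < 35.0:   # ug/m^3
            return "area appears to be in compliance (5% error rate)"
        return "area does not appear to be in compliance"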
What if the DQOs Are Not Met?
DQOs provide a goal on which to build a quality system. As the DQO process is developed, the EPA
identifies measurement quality objectives that are expected to be reasonable and achievable; if these are
met, it can be assumed that the DQOs will be achieved. The DQA process is implemented to confirm the
achievement of the DQOs. However, achieving the DQOs does not equate to one hundred percent
certainty that every NAAQS decision (attainment, non-attainment) will be a correct decision. Even when
a DQO is achieved, the chances of making an incorrect decision increase as the data (e.g., design value)
get closer to the action limit (NAAQS). Similarly, if the DQOs are not met, it does not mean that the
pollutant data cannot be used for NAAQS decisions; it means that the decision makers will have less
confidence that they will make the correct decision, especially around the action limit. Based on this
understanding of uncertainty, EPA listed the DQOs as goals in the CFR. Data quality indicator reports
demonstrate that these goals are being met by the majority of the monitoring organizations, so they are
considered achievable. However, if DQOs are developed and assessments show that the goals cannot be
met, then either the DQOs must be revised or new technologies (sampling or analytical methods) must be
developed to achieve them.
DQA Tools
Over the years, EPA has developed DQOs for each criteria pollutant as it moved through the NAAQS
review process. In addition, monitoring organizations collect enough types of QA/QC data to estimate
the quality of their data and should be able to express the confidence in that information. The following
reports and tools can help monitoring organizations assess the quality of their information.
AMP255 Report
At a minimum, the quality control information described in 40 CFR Part 58 Appendix A that is submitted
to AQS can be used to perform assessments of measurement uncertainty. The AMP255 report is the
most important QA report in AQS for the criteria pollutants. It provides an assessment of each quality
control sample based on the statistical criteria set forth in 40 CFR Part 58 Appendix A. It aggregates data
by PQAO and, depending on the begin date and end date of the selected report, will summarize data by
year as well as by 3-year intervals. It will assess quality control data completeness as well as precision and
bias (depending on the type of quality control sample). A user ID is required to access AQS, and data
must be loaded into AQS before reports can be run, which can be problematic given the lag time of
information reported to AQS.
Data Assessment Statistical Calculator (DASC) Tool

In order to provide monitoring organizations access to CFR statistics prior to submission to AQS, EPA
developed the DASC Tool. This tool, developed in Microsoft Excel, provides for local entry of QC data
and uses the same statistics provided in 40 CFR Part 58 Appendix A. The software and a guidance
document for its use can be found on AMTIC⁶.

⁶ Data Assessment Statistical Calculator: http://www.epa.gov/ttn/amtic/qareport.html
Annual Box and Whisker Plots

The AMP255 and DASC tools are very useful, but EPA was also looking for more graphical ways to
display precision and bias data in order to assist monitoring organizations in identifying monitoring sites
in need of corrective action. Each year, after the May certification, AQS develops the Annual Box and
Whisker Plots for the criteria gaseous pollutant data certified in May; the report therefore covers the
previous year. Figure 18.3 provides an example of the report. The plots are created using the 1-point QC
checks for the gaseous pollutants for each site within the PQAO and include the same precision and bias
information that is generated by the AMP255 report, as well as the number of observations used in the
assessment (yellow band of data in Fig. 18.3). In addition, the graphical display can identify sites that are
biased or variable. In the example below, all sites demonstrate acceptable precision, with two sites
showing an acceptable but positive bias, one site showing neither positive nor negative bias (no sign),
and one site showing an acceptable negative bias. Information on how to assess the box and whisker
information, as well as the annual reports, can be found on AMTIC⁷. At present, the report has not been
automated, so it is run by EPA once a year. In the future, EPA hopes to automate the report for use at
any time.
Figure 18.3 Example Box and Whisker Plots
⁷ Criteria Pollutant Quality Indicator Summary Report for AQS Data: http://www.epa.gov/ttn/amtic/qareport.html
Appendix A
National Air Quality Monitoring Program Fact Sheets
The following information provides a fact sheet on a number of national ambient air
monitoring networks including:
State or Local Air Monitoring Stations (SLAMS) Network
National Core (NCore) Network
Photochemical Assessment Monitoring Stations (PAMS)
PM2.5 Chemical Speciation Network (CSN)
National Toxics Trends Network (NATTS)
Interagency Monitoring of Protected Visual Environments (IMPROVE)
Clean Air Status and Trends Network (CASTNET)
National Atmospheric Deposition Network (NADP)
National Air Toxics Assessment (NATA)
Only the SLAMS, NCore, PAMS, CSN and NATTS pertain to the information
covered in the Handbook. The other networks described are for the benefit of the
reader.
State or Local Air Monitoring Stations (SLAMS) Network
Background
The SLAMS make up the ambient air quality monitoring sites that are operated by State or local agencies
for the primary purpose of comparison to the National Ambient Air Quality Standards (NAAQS), but may
serve other purposes such as:
provide air pollution data to the general public in a timely manner;
support compliance with air quality standards and emissions strategy development; and
support air pollution research studies.
The SLAMS network includes stations classified as NCore, PAMS, and Speciation, and formerly
categorized as NAMS, and does not include Special Purpose Monitors (SPM) and other monitors used for
non-regulatory or industrial monitoring purposes.
In order to support the objectives, the monitoring networks are designed with a variety of monitoring sites
that generally fall into the following categories which are used to determine:
1. the highest concentrations expected to occur in the area covered by the network;
2. typical concentrations in areas of high population density;
3. the impact on ambient pollution levels of significant sources or source categories;
4. the general background concentration levels;
5. the extent of regional pollutant transport among populated areas, and in support of secondary
standards; and
6. air pollution impacts on visibility, vegetation damage, or other welfare- based impacts.
The monitoring aspects of the SLAMS program are found in the Code of Federal Regulations, Title 40,
Parts 50, 53 and 58.
SLAMS must use approved Federal reference method (FRM), Federal equivalent method (FEM), or Approved
Regional Method (ARM) monitors for ambient pollutant levels being compared to the NAAQS.
Reference Category | References | Comments
Program References | 40 CFR Parts 50, 53 and 58; http://www.epa.gov/ttn/amtic/ |
Pollutants Measured | O3, CO, SO2, NO2, PM2.5, PM10, Pb |
Methods References | 40 CFR Parts 50 and 58, Appendix C; http://www.epa.gov/ttn/amtic/criteria.html | Must be FRM, FEM, or ARM for NAAQS comparisons; website lists designated methods
Network Design References | 40 CFR Part 58, Appendices D and E |
Siting Criteria | 40 CFR Part 58, Appendix E |
Quality System References | 40 CFR Part 58, Appendix A; http://www.epa.gov/ttn/amtic/quality.html; http://www.epa.gov/ttn/amtic/met.html | Websites for QA Handbook Vol II and QA Handbook Vol IV
Data Management References | http://www.epa.gov/ttn/airs/airsaqs/ | Air Quality System
National Core (NCore) Network
Background
NCore is a multi-pollutant network that integrates several advanced measurement systems for particles,
pollutant gases and meteorology. Most NCore stations have been operating since the formal start of the
network on January 1, 2011. The NCore Network addresses the following objectives:
Timely reporting of data to public by supporting AIRNow, air quality forecasting, and other public
reporting mechanisms;
Support for development of emission strategies through air quality model evaluation and other
observational methods;
Accountability of emission strategy progress through tracking long-term trends of criteria and
non-criteria pollutants and their precursors;
Support for long-term health assessments that contribute to ongoing reviews of the NAAQS;
Compliance through establishing nonattainment/attainment areas through comparison with the
NAAQS;
Support to scientific studies ranging across technological, health, and atmospheric process
disciplines; and
Support to ecosystem assessments recognizing that national air quality networks benefit ecosystem
assessments and, in turn, benefit from data specifically designed to address ecosystem analyses.
The objective is to locate sites in broadly representative urban (about 55 sites) and rural (about 20 sites)
locations throughout the country to help characterize regional and urban patterns of air pollution.
In many cases, states will collocate these new stations with STN sites measuring speciated PM2.5
components, PAMS sites already measuring O3 precursors, and/or NATTS sites measuring air toxics. By
combining these monitoring programs at a single location, EPA and its partners will maximize the multi-
pollutant information available. This greatly enhances the foundation for future health studies, NAAQS
revisions, validation of air quality models, assessment of emission reduction programs, and studies of
ecosystem impacts of air pollution.
Reference Category | References | Comments
Program References | http://www.epa.gov/ttn/amtic/monitor.html |
Pollutants Measured | SO2, CO, NO and NOy, O3, PM2.5, PM10-2.5, basic meteorological parameters |
Methods References | http://www.epa.gov/ttn/amtic/precur.html; http://www.epa.gov/ttn/amtic/ncore/guidance.html |
Network Design References | http://www.epa.gov/ttn/amtic/monstratdoc.html |
Siting Criteria | http://www.epa.gov/ttn/amtic/ncore/networks.html |
Quality System References | http://www.epa.gov/ttn/amtic/ncore/guidance.html |
Data Management References | http://www.epa.gov/ttn/amtic/ncore/guidance.html |
Photochemical Assessment Monitoring Stations (PAMS)
Background
Section 182(c)(1) of the 1990 Clean Air Act Amendments (CAAA) requires the Administrator to
promulgate rules for the enhanced monitoring of ozone, oxides of nitrogen (NOx), and volatile organic
compounds (VOC) to obtain more comprehensive and representative data on ozone air pollution.
Immediately following the promulgation of such rules, the affected states were to commence such actions
as were necessary to adopt and implement a program to improve ambient monitoring activities and the
monitoring of emissions of NOx and VOC. Each State Implementation Plan (SIP) for the affected areas
must contain measures to implement the ambient monitoring of such air pollutants. The subsequent
revisions to Title 40, Code of Federal Regulations, Part 58 (40 CFR 58) required states to establish
Photochemical Assessment Monitoring Stations (PAMS) as part of their SIP monitoring networks in ozone
nonattainment areas classified as serious, severe, or extreme.
The chief objective of the enhanced ozone monitoring revisions is to provide an air quality database that
will assist air pollution control agencies in evaluating, tracking the progress of, and, if necessary, refining
control strategies for attaining the ozone NAAQS. Ambient concentrations of ozone and ozone precursors
will be used to make attainment/nonattainment decisions, aid in tracking VOC and NOx emission inventory
reductions, better characterize the nature and extent of the ozone problem, and prepare air quality trends. In
addition, data from the PAMS will provide an improved database for evaluating photochemical model
performance, especially for future control strategy mid-course corrections as part of the continuing air
quality management process. The data will be particularly useful to states in ensuring the implementation
of the most cost-effective regulatory controls.
Reference Category | References | Comments
Program References | http://www.epa.gov/ttn/amtic/pamsrein.html; http://www.epa.gov/ttn/amtic/pamsmain.html |
Pollutants Measured | Ozone, nitrogen oxides, VOCs, surface meteorological parameters |
Methods References | http://www.epa.gov/ttn/amtic/pamsguidance.html |
Network Design References | http://www.epa.gov/ttn/amtic/pamssites.html; http://www.epa.gov/ttn/amtic/pamsguidance.html |
Siting Criteria | http://www.epa.gov/ttn/amtic/pamsguidance.html |
Quality System References | http://www.epa.gov/ttn/amtic/pamsdata.html |
Data Management References | http://www.epa.gov/ttn/amtic/pamsdata.html |
PM2.5 Chemical Speciation Network
Background
As part of the PM2.5 National Ambient Air Quality Standards (NAAQS) review completed in 1997, EPA
established a PM2.5 Chemical Speciation Network (CSN) consisting of Speciation Trends Network (STN)
sites and supplemental speciation sites. The CSN is a component of the National PM2.5 Monitoring
Network. Although the CSN is intended to complement the activities of the much larger gravimetric PM2.5
measurements network component (whose goal is to establish if the NAAQS are being attained), CSN data
is not used for attainment or nonattainment decisions. CSN data is used for multiple objectives, which
include:
The assessment of trends;
The development of effective State Implementation Plans (SIPs) and determination of regulatory
compliance;
The development of emission control strategies and tracking progress of control programs;
Aiding in the interpretation of health studies by linking effects to PM2.5 constituents;
Characterizing annual and seasonal spatial variation of aerosols;
Comparison to chemical speciation data collected from the IMPROVE network.
As of 2012, the PM2.5 Chemical Speciation Network includes about 50 STN sites and about 150 State and
Local Air Monitoring Stations (SLAMS) supplemental sites. All STN sites operate on a one-in-three day
sample collection schedule. The majority of the SLAMS supplemental sites operate on a one-in-six day
sample collection schedule. CSN sites collect aerosol samples over 24 hours on filters that are analyzed for
PM2.5 mass, a number of trace elements, major ions (sulfate, nitrate, ammonium, sodium and potassium),
and organic and elemental carbon.
CSN data users include those at EPA seeking to determine concentration trends of PM2.5 chemical species
over a period of 3 or more years and decision-makers at tribal, state and local levels who use the data as
input to models and for development of emission control strategies and determination of their long-term
effectiveness. Other users include public health officials and epidemiological researchers.
Reference Category | References | Comments
Program References | http://www.epa.gov/ttn/amtic/speciepg.html |
Pollutants Measured | Mass, trace elements, ions, and organic and elemental carbon |
Methods References | http://www.epa.gov/ttn/amtic/specsop.html; http://www.epa.gov/ttn/amtic/spectraining.html |
Network Design References | http://www.epa.gov/ttn/amtic/specgen.html |
Siting Criteria | http://www.epa.gov/ttn/amtic/specgen.html |
Quality System References | http://www.epa.gov/ttn/amtic/specguid.html |
Data Management References | http://www.epa.gov/ttn/amtic/specdat.html; http://www.epa.gov/ttn/airs/airsaqs/detaildata/downloadaqsdata.htm; http://www.epa.gov/airdata/ |
National Toxics Trends Network (NATTS)
Background
The National Air Toxics Trends Station (NATTS) Network was developed to fulfill the need for long-term
HAP monitoring data of consistent quality. Among the principal objectives are assessing trends and
emission reduction program effectiveness, assessing and verifying air quality models (e.g., exposure
assessments, emission control strategy development, etc.), and providing direct input to source-receptor
models. The current network configuration includes 27 sites (20 urban, 7 rural) across the United States;
thirteen sites were established in 2003, ten sites in 2004, and two sites each in 2007 and 2008. There are
typically over 100 pollutants monitored at each NATTS (though only 19 of those are required); included
are VOCs, carbonyls, PM10 metals, hexavalent chromium, and PAHs. Specifically, it is anticipated that
the NATTS data will be used for:
tracking trends in ambient levels to facilitate tracking progress toward emission and risk reduction
goals, which is the major objective of this program;
directly evaluating public exposure and environmental impacts in the vicinity of monitors;
providing quality-assured data for risk characterization;
assessing the effectiveness of specific emission reduction activities; and
evaluating and subsequently improving air toxics emission inventories and model performance.
Reference Category | References | Comments
Program References | http://www.epa.gov/ttn/amtic/natts.html |
Pollutants Measured | 33 HAPs, which include metals, VOCs and carbonyls; http://www.epa.gov/ttn/amtic/files/ambient/airtox/nattsworkplantemplate.pdf |
Methods References | http://www.epa.gov/ttn/amtic/airtox.html; http://www.epa.gov/ttn/amtic/files/ambient/airtox/nattsworkplantemplate.pdf |
Network Design References | http://www.epa.gov/ttn/amtic/airtoxqa.html | National Air Toxics Trends Stations Quality Management Plan, final 09/09/05
Siting Criteria | 40 CFR Part 58 Appendix E; PAMS Probe and Path Siting Criteria |
Quality System References | http://www.epa.gov/ttn/amtic/airtoxqa.html |
Data Management References | http://www.epa.gov/ttn/amtic/toxdat.html |
Interagency Monitoring of Protected Visual Environments (IMPROVE)
Background
The Interagency Monitoring of Protected Visual Environments (IMPROVE) program is a cooperative
measurement effort governed by a steering committee composed of representatives from federal and
regional-state organizations. The IMPROVE monitoring program was established in 1985 to aid the
creation of Federal and State Implementation Plans for the protection of visibility in Class I areas (156
national parks and wilderness areas) as stipulated in the 1977 amendments to the Clean Air Act.
The objectives of IMPROVE are:
1. to establish current visibility and aerosol conditions in mandatory Class I areas;
2. to identify chemical species and emission sources responsible for existing man-made visibility
impairment;
3. to document long-term trends for assessing progress towards the national visibility goal; and
4. with the enactment of the Regional Haze Rule, to provide regional haze monitoring representing
all visibility-protected federal Class I areas where practical.
IMPROVE has also been a key participant in visibility-related research, including the advancement of
monitoring instrumentation, analysis techniques, visibility modeling, policy formulation and source
attribution field studies. In addition to 110 IMPROVE sites at visibility-protected areas, IMPROVE
Protocol sites are operated identically at locations chosen to serve the needs of states, tribes and federal
agencies.
Reference Category | References | Comments
Program References | http://vista.cira.colostate.edu/improve/; http://vista.cira.colostate.edu/improve/Overview/IMPROVEProgram_files/frame.htm |
Pollutants Measured | PM10 and PM2.5 mass concentration; PM2.5 elements heavier than sodium, anions, and organic and elemental carbon concentrations; optical and meteorological parameters at select sites | All sites have aerosol speciation monitoring on a one-day-in-three, 24-hour duration sampling schedule
Methods References | http://vista.cira.colostate.edu/improve/Publications/IMPROVE_SOPs.htm |
Network Design References | http://vista.cira.colostate.edu/improve/Publications/IMPROVE_SOPs.htm |
Siting Criteria | http://vista.cira.colostate.edu/improve/Publications/IMPROVE_SOPs.htm |
Quality System References | http://vista.cira.colostate.edu/improve/Data/QA_QC/qa_qc_Branch.htm |
Data Management References | http://vista.cira.colostate.edu/improve/Data/data.htm |
Clean Air Status and Trends Network (CASTNET)
Background
The Clean Air Status and Trends Network (CASTNET) is a national air quality monitoring network
designed to provide data to assess trends in air quality, atmospheric deposition, and ecological effects due
to changes in air pollutant emissions. CASTNET began collecting measurements in 1991 with the
incorporation of 50 sites from the National Dry Deposition Network, which had been in operation since
1987. CASTNET provides long-term monitoring of air quality in rural areas to determine trends in regional
atmospheric nitrogen, sulfur, and ozone concentrations and deposition fluxes of sulfur and nitrogen
pollutants in order to evaluate the effectiveness of national and regional air pollution control programs.
CASTNET operates more than 80 regional sites throughout the contiguous United States, Alaska, and
Canada. Sites are located in areas where urban influences are minimal. Ozone measurements became
compliant with 40 CFR Part 58, Appendix A in 2011. Meteorological measurements are made at
approximately 30 sites, and are available for all sites prior to 2010. Modeled dry deposition velocities are
also provided.
The main objectives of the network are to:
1) track the effectiveness of national and regional scale emission control programs;
2) report high quality, publicly available data on the temporal and geographic patterns of air
quality and atmospheric deposition trends; and
3) provide the necessary information for understanding the environmental effects in sensitive
terrestrial and aquatic receptor areas associated with atmospheric loadings of pollutants.
Reference Category | References | Comments
Program References | CASTNET Main Webpage: http://www.epa.gov/castnet/; CASTNET Annual Report: http://java.epa.gov/castnet/documents.do |
Pollutants Measured | Sulfate, nitrate, ammonium, sulfur dioxide, nitric acid, base cations, ozone | CASTNET Factsheet: http://java.epa.gov/castnet/documents.do
Methods References | CASTNET Quality Assurance Project Plan (QAPP) Main Body: http://java.epa.gov/castnet/documents.do |
Network Design References | CASTNET QAPP Main Body: http://java.epa.gov/castnet/documents.do |
Siting Criteria | CASTNET QAPP Main Body: http://java.epa.gov/castnet/documents.do |
Quality System References | CASTNET QAPP Main Body: http://java.epa.gov/castnet/documents.do |
Data Management References | CASTNET QAPP Appendix 6: CASTNET Data Operations Standard Operating Procedures: http://java.epa.gov/castnet/documents.do |
National Atmospheric Deposition Network (NADP)
Background
The National Atmospheric Deposition Program (NADP) provides quality-assured data and information in support
of research on the exposure of managed and natural ecosystems and cultural resources to acidic compounds,
nutrients, base cations, and mercury in precipitation. The NADP also provides data on ambient concentrations of
speciated mercury and gaseous ammonia. NADP data serve science and education and support informed decisions
on air quality issues related to precipitation and atmospheric chemistry.
The NADP operates three precipitation chemistry networks: the 250-station National Trends Network (NTN), the
7-station Atmospheric Integrated Research Monitoring Network (AIRMoN), and the 100-station Mercury
Deposition Network (MDN), and two ambient monitoring networks: the 20-station Atmospheric Mercury Network
(AMNet) and the 50-station Ammonia Monitoring Network (AMoN). The NTN provides the only long-term
nationwide record of the wet deposition of acids, nutrients, and base cations. NTN stations collect one-week
precipitation samples in 48 states, Puerto Rico, the Virgin Islands, and Quebec Province, Canada. Complementing
the NTN is the 7-station AIRMoN. The daily precipitation samples collected at AIRMoN stations support continued
research of atmospheric transport and removal of air pollutants and the development of computer simulations of
these processes. The 100-station MDN offers the only regional measurements of mercury (Hg) in North American
precipitation. MDN data are used to quantify Hg deposition to water bodies that have fish and wildlife
consumption advisories due to this toxic chemical. The AMNet complements the MDN by measuring speciated
hourly samples of ambient Hg at 25 monitoring stations. AMNet measurements are made using a Tekran
instrument which analyzes ambient samples for elemental, gaseous, and particulate-bound Hg fractions. The
AMoN is the only national monitoring network measuring ambient ammonia (NH3) concentrations. Bi-weekly
measurements of NH3 complement the NTN and CASTNET networks by filling a gap in the total nitrogen budget.
Work continues on developing routine model estimates for Hg and NH3 bi-directional dry deposition velocities.
In addition to these long-term monitoring networks, the NADP is responsive to emerging issues requiring new or
expanded measurements. Its measurement system is efficient, its data meet pre-defined data quality objectives,
and its reports and products are designed to meet user needs.
Reference Category | References | Comments
Program References | NADP: http://nadp.isws.illinois.edu/; NTN: http://nadp.isws.illinois.edu/NTN/; AIRMoN: http://nadp.isws.illinois.edu/AIRMoN/; MDN: http://nadp.isws.illinois.edu/MDN/; AMNet: http://nadp.isws.illinois.edu/amn/; AMoN: http://nadp.isws.illinois.edu/AMoN/ |
Pollutants Measured | In precipitation: sulfate, nitrate, chloride, ammonium, calcium, magnesium, sodium, potassium, pH, mercury. Ambient concentrations: speciated mercury, ammonia |
Methods References | http://nadp.isws.illinois.edu/lib/manuals/opman.pdf; http://nadp.isws.illinois.edu/lib/manuals/mdnopman.pdf; http://nadp.isws.illinois.edu/amn/docs/AMNet_Operations_Manual.pdf |
Network Design References | http://nadp.isws.illinois.edu/lib/manuals/NADP_Site_Selection_and_Installation_Manual.pdf |
Siting Criteria | http://nadp.isws.illinois.edu/lib/manuals/NADP_Site_Selection_and_Installation_Manual.pdf |
Quality System References | http://nadp.isws.illinois.edu/lib/qaPlans.aspx; http://nadp.isws.illinois.edu/lib/qaReports.aspx |
Data Management References | http://nadp.isws.illinois.edu/lib/qaplans/NADP_Network_Quality_Assurance_Plan.pdf; http://nadp.isws.illinois.edu/amn/docs/AMNet_Data_Management_Manual.pdf |
National Air Toxics Assessment (NATA)
Background
NATA is a national-scale assessment of 33 air pollutants (a subset of 32 air toxics on the Clean Air Act's list of 188,
plus diesel particulate matter). The assessment considers the year 1996 (an update to 1999 is in preparation), including:
compilation of a national emissions inventory of air toxics emissions from outdoor sources;
estimates of ambient concentrations across the contiguous United States;
estimates of population exposures; and
characterizations of potential public health risks including both cancer and non-cancer effects.
NATA identifies those air toxics which are of greatest potential concern, in terms of contribution to population risk.
This information is relevant and useful in assessing risk for tribal programs.
Reference Category | References | Comments
Program References | http://www.epa.gov/ttn/atw/nata/index.html |
Pollutants Measured | http://www.epa.gov/ttn/atw/nata/34poll.html | 33 air pollutants (see link)
Methods References | |
Network Design References | |
Siting Criteria | |
Quality System References | |
Data Management References | |
Appendix B
Ambient Air Monitoring Quality Assurance Information and
Web Addresses
The following information provides key guidance documents and reports that can
be found on various sites within the Ambient Monitoring Technical Information
Center (AMTIC) Website. The following identifiers are used to describe national
ambient air monitoring programs:
SLAMS - State or Local Air Monitoring Stations Network
NCore - National Core Network
PAMS - Photochemical Assessment Monitoring Stations
CSN - PM2.5 Chemical Speciation Network
NATTS - National Toxics Trends Network
SLAMS-NPAP - National Performance Audit Program
SLAMS-PEP - National PM2.5 Performance Evaluation Program
Ambient Air Quality Assurance Information

GUIDANCE DOCUMENTS
Identifier | Title | EPA Number | Year | URL
CSN | Particulate Matter (PM2.5) Speciation Guidance Document | | 1999 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/specfinl.pdf
NATTS | NATTS Technical Assistance Document (TAD) | | 2009 | http://www.epa.gov/ttn/amtic/files/ambient/airtox/nattsTADRevision2_508Compliant.pdf
NCore | NCore Technical Assistance Document (TAD) | | 2005 | http://www.epa.gov/ttn/amtic/files/ambient/monitorstrat/precursor/tadversion4.pdf
NCore | QA Handbook for Air Pollution Measurement Systems Volume IV: Meteorological Measurement Systems | EPA-454/B-08-002 | 2008 | http://www.epa.gov/ttn/amtic/files/ambient/met/Volume%20IV_Meteorological_Measurements.pdf
PAMS | Technical Assistance Document (TAD) for Sampling and Analysis of Ozone Precursors | EPA/600-R-98/161 | 1998 | http://www.epa.gov/ttn/amtic/files/ambient/pams/newtad.pdf
SLAMS | QA Handbook for Air Pollution Measurement Systems Volume II: Ambient Air Quality Monitoring Program | EPA-454/B-13-003 | 2013 | http://www.epa.gov/ttn/amtic/qalist.html
SLAMS | Guideline on the Meaning and the Use of Precision and Bias Data Required by 40 CFR Part 58 Appendix A | EPA-454/B-07-001 | 2007 | http://www.epa.gov/ttn/amtic/files/ambient/qaqc/P&B%20Guidance%2010.10.07%20vers1.1.pdf
SLAMS | Transfer Standards for the Calibration of Air Monitoring Analyzers for Ozone | EPA-454/B-10-001 | 2010 | http://www.epa.gov/ttn/amtic/files/ambient/qaqc/OzoneTransferStandardGuidance.pdf
SLAMS-PM2.5 | PM2.5 Quality Assurance Program Overview | | 1997 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/pm25qa.pdf

QA REPORTS
CSN | PM2.5 Speciation Lab Audit Reports and Assessments | | Various Years | http://www.epa.gov/ttn/amtic/pmspec.html
NATTS | National Air Toxics Trends Stations Quality Assurance Annual Reports and Proficiency Reports | | Various Years | http://www.epa.gov/ttn/amtic/airtoxqa.html
SLAMS | Annual Precision, Bias and Completeness Reports for Criteria Pollutants | | Various Years | http://www.epa.gov/ttn/amtic/parslist.html
PAMS | PAMS Data Analysis and Reports | | Various Years | http://www.epa.gov/ttn/amtic/pamsdata.html
SLAMS-PM2.5 | 3-Year and Annual QA Reports | | Various Years | http://www.epa.gov/ttn/amtic/anlqa.html
SLAMS | AA-PGVP Annual Reports | | Various Years | http://www.epa.gov/ttn/amtic/aapgvp.html
SLAMS-PEP | Laboratory Comparison Study of Gravimetric Laboratories | | Various Years | http://www.epa.gov/ttn/amtic/pmpep.html

METHODS
CSN | Speciation Field Guidance Documents | | Various Years | http://www.epa.gov/ttn/amtic/specguid.html
NATTS | Air Toxics Methods - Various Methods | | 2007 | http://www.epa.gov/ttn/amtic/airtox.html
NCore | NCore Training Videos | | Various Years | http://www.epa.gov/ttn/amtic/qalist.html
SLAMS | QA Handbook Vol II (DRAFT Procedure for the "Determination of Ozone By Ultraviolet Analysis") | | 1998 | http://www.epa.gov/ttn/amtic/files/ambient/qaqc/ozone4.pdf
SLAMS | Sec. 2.10 of QA Handbook - Draft - PM10 Dichot revised to local standard and pressure | EPA-600/4-77-027a | 1997 | http://www.epa.gov/ttn/amtic/files/ambient/qaqc/2-10meth.pdf
SLAMS | Sec. 2.11 of QA Handbook - Draft - PM10 Hi Vol revised to local standard and pressure | | 1997 | http://www.epa.gov/ttn/amtic/files/ambient/qaqc/2-11meth.pdf
SLAMS | Section 2.3 - DRAFT - Reference Method for the Determination of Nitrogen Dioxide in the Atmosphere (Chemiluminescence) | | 2002 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/no2.pdf
SLAMS-NPAP | DRAFT SOP for Through-the-Probe Performance Evaluations of Ambient Air Quality Monitoring of Criteria Air Pollutants | | 2007 | http://www.epa.gov/ttn/amtic/npapsop.html
SLAMS-PEP | Method Compendium "Field Standard Operating Procedures for the PM2.5 Performance Evaluation Program" | | 2009 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/PEP_Field_SOP.pdf
SLAMS-PEP | Method Compendium "PM2.5 Mass Weighing Laboratory Standard Operating Procedures for the Performance Evaluation Program" | | 1998 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/peplsop.pdf
SLAMS-PM2.5 | 2.12 "Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent Methods" | | 1998 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/m212covd.pdf
SLAMS-Pb | Approved Equivalent Methods | | Various Years | http://www.epa.gov/ttn/amtic/pb-monitoring.html
SLAMS-Pb | RTI Procedure for the development of Pb Analysis Audits (TSP) | | 2010 | http://www.epa.gov/ttn/amtic/files/ambient/pb/rtipbauditstrip2010.pdf
SLAMS-Pb | RTI Procedure for the development of Pb Analysis Audits (Teflon for ICP-MS) | | 2012 | http://www.epa.gov/ttn/amtic/files/ambient/pb/rtipbauditsopteflon.pdf
SLAMS-Pb | MO Procedure for the development of Pb Analysis Audits (TSP) | | 2009 | http://www.epa.gov/ttn/amtic/files/ambient/pb/MOAuditStripMethod.pdf

IMPLEMENTATION PLANS and QAPPs
CSN | Speciation Laboratory Standard Operating Procedures | | Various Years | http://www.epa.gov/ttn/amtic/specsop.html
CSN | Chemical Speciation Network QAPP for NCore and Supplemental Sites | EPA-454/B-12-003 | 2012 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/CSN_QAPP_v120_05-2012.pdf
CSN | Quality Management Plan for the PM2.5 Speciation Trends Network | EPA-454/R-01-009 | 2001 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/finlqmp.pdf
NATTS | Model Quality Assurance Project Plan for the National Air Toxics Trends Stations - updated version 1.1 | | 2007 | http://www.epa.gov/ttn/amtic/files/ambient/airtox/NATTS_Model_QAPP.pdf
NATTS | Model QAPP for Local-Scale Monitoring Projects | EPA-454/R-01-007 | 2006 | http://www.epa.gov/ttn/amtic/files/ambient/airtox/pilotqapp.pdf
NATTS | National Air Toxics Trends Stations - Quality Management Plan Final | | 2005 | http://www.epa.gov/ttn/amtic/files/ambient/airtox/nattsqmp.pdf
PAMS | PAMS Implementation Manual | EPA-454/B-93-051 | 1994 | http://www.epa.gov/ttn/amtic/files/ambient/pams/b93-051a.pdf
SLAMS | Quality Assurance Project Plan for the Audit Support Program - NPAP and NATTS | | 2008 | http://www.epa.gov/ttn/amtic/files/ambient/qaqc/NPAPQAPPrvsn071007onforTTP.pdf
SLAMS-PM2.5 | PM2.5 Model QA Project Plan (QAPP) | EPA-454/R-98-005 | 1998 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/totdoc.pdf
SLAMS-PM2.5 | PM2.5 FRM Network Federal Performance Evaluation Program Quality Assurance Project Plan (QAPP) | | 2007 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/pepqapp_DRAFT_12-2007_cmt_vrsn.pdf
SLAMS-PM2.5 | PM2.5 Performance Evaluation Program Implementation Plan | | 1998 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/qa/pep-ip.pdf
AA-PGVP | Ambient Air Protocol Gas Verification Program QAPP | | 2010 | http://www.epa.gov/ttn/amtic/files/ambient/qaqc/pgvp-qapp-march2010v2.pdf
AA-PGVP | Ambient Air Protocol Gas Verification Program Implementation Plan | | 2010 | http://www.epa.gov/ttn/amtic/files/ambient/qaqc/aapgvpimpplan.pdf

WHITE PAPERS/IMPORTANT MEMOS
CSN | Current List of CSN Sites as of 07-11-2007 | | 2013 | http://www.epa.gov/ttn/amtic/specgen.html
CSN | Modification of Carbon Procedures in the Speciation Network; Overview and Frequently Asked Questions | | 2006 | http://www.epa.gov/ttn/amtic/files/ambient/pm25/spec/faqcarbon.pdf
SLAMS | Training and Conferences | | Various Years | http://www.epa.gov/ttn/amtic/training.html
SLAMS | QA Newsletters | | Various Years | http://www.epa.gov/ttn/amtic/qanews.html
Appendix C
Using the Graded Approach for the Development of QMPs and
QAPPs in Ambient Air Quality Monitoring Programs
NOTE: As of the date of this Handbook publication, the EPA Quality Staff is revising some
of the requirements for QAPPs and QMPs. Please visit the Quality Staff's website for
updates on these documents (http://www.epa.gov/quality1/).
QA Handbook Volume II, Appendix C
Revision No. 0
Date:05/13
Page 3 of 7
Using the Graded Approach for the Development of QMPs and QAPPs in Ambient Air
Quality Monitoring Programs
EPA policy requires that all organizations funded by EPA for environmental data operations
(EDOs) develop quality management plans (QMPs) and quality assurance project plans
(QAPPs). In addition, EPA has provided flexibility to EPA organizations on how they implement
this policy, allowing for use of a graded approach. The following explains the graded approach
that OAQPS proposes for the development of QMPs and QAPPs for data collection activities
related to ambient air monitoring.
The Graded Approach
The QMP describes the quality system in terms of the organizational structure, functional
responsibilities of management and staff, lines of authority, and required interfaces for those
planning, implementing, and assessing activities involving EDOs. Each program should provide
appropriate documentation of their quality system. Here are a few ways that this could be
handled.
Concept - Small organizations may have limited ability to develop and implement a quality
system. EPA should provide options for those who are capable of making progress towards
developing a quality system. If it is clear that the EDO goals are understood and that progress in
quality system development is being made, a non-optimal quality system structure, for the
interim, is acceptable. The concept is to work with the small organization to view the QMP as a
long-term strategic plan with an open ended approach to quality system development that will
involve continuous improvement. The graded approach to QMP development is described below
and is based on the size of the organization and experience in working with EPA and the
associated QA requirements.
1. Small organization that has just received its first EPA grant or is using a grant for a discrete,
small, project-level EDO. Such an organization could incorporate a description of its
quality system into its QAPP.
2. Small organization implementing EDOs with EPA at more frequent intervals or
implementing long-term monitoring programs with EPA funds. If such an organization
demonstrates capability of developing and implementing a stand-alone quality system, it
is suggested that an appropriate separate QMP be written.
3. Medium or large organization. Develop QMP to describe its quality system and QAPPs
for specific EDOs. Approval of the recipient's QMP by the EPA Project Officer and the
EPA Quality Assurance Manager may allow delegation of the authority to review and
approve Quality Assurance Project Plans (QAPPs) to the grant recipient based on
acceptable procedures documented in the QMP.
Quality Assurance Project Plans
The QAPP is a formal document describing, in comprehensive detail, the necessary QA/QC and
other technical activities that must be implemented to ensure that the results of work performed
will satisfy the stated performance criteria, which may be in the form of a data quality objective
(DQO). The quality assurance policy of the EPA requires every EDO to have written and
approved quality assurance project plans (QAPPs) prior to the start of the EDO. It is the
responsibility of the EPA Project Officer (person responsible for the technical work on the
project) to adhere to this policy. If the Project Officer gives permission to proceed without an
approved QAPP, he/she assumes all responsibility. If a grantee's QMP is approved by EPA and
provides for delegation of QAPP approval to the grantee, the grantee is responsible for ensuring
approval of the QAPP prior to the start of the EDO.
The Ambient Air Monitoring Program recommends a four-tiered project category approach to
the Ambient Air QA Program in order to effectively focus QA. Category I involves the most
stringent QA approach, utilizing all QAPP elements as described in EPA QA/R-5ᵃ (see Table 2),
whereas category IV is the least stringent, utilizing fewer elements. In addition, the amount of
detail or specificity required for each element will be less as one moves from category I to IV.
Table 1 provides information that helps to define the categories of QAPPs based upon the data
collection objective. Each type of ambient air monitoring program EDO will be associated with
one of these categories. The comment area of the table will identify whether QMPs and QAPPs
can be combined and the type of data quality objectives (DQOs) required (see below). Table 2
identifies which of the 24 QAPP elements are required for each category of QAPP. Based upon
a specific project, the QAPP approving authority may add/delete elements for a particular
category as it relates to the project but in general, this table will be applicable based on the
category of QAPP.
Flexibility on the systematic planning process and DQO development
Table 1 describes four QAPP/QMP categories, each of which requires some type of statement about the
program or project objectives. Three of the categories use the term data quality objectives
(DQOs), but there should be flexibility within the systematic planning process on how these DQOs
are developed, based on the particular category. For example, a category I project would have
formal DQOs. Examples of category I projects, such as the State and Local Air Monitoring Stations
(SLAMS), have DQOs developed by OAQPS. Category II QAPPs may have formal DQOs
developed if there are national implications to the data (i.e., Speciation Trends Network) or less
formal DQOs if developed by organizations implementing important projects that are more local
in scope. Categories III and IV would require less formal DQOs, to the point that only project goals
(category IV) may be necessary.
ᵃ EPA Requirements for QA Project Plans (QA/R-5): http://www.epa.gov/quality/qa_docs.html
Standard Operating Procedures (SOPs)
SOPs are an integral part of the QAPP development and approval process and usually address
key information required by the QAPP elements. Therefore, SOPs can be referenced in QAPP
elements as long as the SOPs are available for review or are part of the QAPP.
Table 2 QAPP Elements

QAPP Element | Category Applicability
A1 Title and Approval Sheet | I, II, III, IV
A2 Table of Contents | I, II, III
A3 Distribution List | I
A4 Project/Task Organization | I, II, III
A5 Problem Definition/Background | I, II, III
A6 Project/Task Description | I, II, III, IV
A7 Quality Objectives and Criteria for Measurement Data | I, II, III, IV
A8 Special Training Requirements/Certification | I
A9 Documentation and Records | I, II, III
B1 Sample Process (Network) Design | I, II, III, IV
B2 Sampling Methods Requirements | I, II, III
B3 Sample Handling and Custody Requirements | I, II, III
B4 Analytical Methods Requirements | I, II, III, IV
B5 Quality Control Requirements | I, II, III, IV
B6 Instrument/Equipment Testing, Inspection & Maintenance | I, II, III
B7 Instrument Calibration and Frequency | I, II, III
B8 Inspection/Acceptance Requirements for Supplies and Consumables | I
B9 Data Acquisition Requirements for Non-direct Measurements | I, II, III
B10 Data Management | I, II
C1 Assessments and Response Actions | I, II
C2 Reports to Management | I, II
D1 Data Review, Validation, and Verification Requirements | I, II, III
D2 Validation and Verification Methods | I, II
D3 Reconciliation and User Requirements | I, II
Appendix D
Measurement Quality Objectives and Validation Templates

Table of Contents
(click on link to go to individual tables)

Validation Template | Page
O3 | 4
CO | 7
NO2, NOx, NO | 10
SO2 | 13
PM2.5 Filter Based Local Conditions | 16
Continuous PM2.5 Local Conditions | 21
PM10c for PM10-2.5 Low Volume, Filter-Based Local Conditions | 24
PM10 Filter Based Dichot STP Conditions | 29
PM10 Filter Based High Volume (HV) STP Conditions | 32
Continuous PM10 STP Conditions | 35
PM10 Low Volume STP Filter-Based Local Conditions | 37
Pb High Volume (TSP) | 42
Pb Low Volume (PM10) | 46
NOTE: There is a potential that information on the validation templates has changed.
The templates are posted here for reference purposes; however, the user is directed to the AMTIC
website:
http://www.epa.gov/ttn/amtic/qalist.html
The attached validation tables are found there, as well as a table that is updated with any changes
occurring after the publication date of the Handbook.
In addition, at the time of publication, NCore validation templates were being reviewed and
refined. When completed, they will be posted on the AMTIC website listed above.
In June 1998, a workgroup was formed to develop a procedure that could be used by State and
local agencies and that would provide for consistent validation of PM2.5 mass concentrations across the US.
The workgroup included personnel from the monitoring organizations, EPA Regional Offices,
and OAQPS who are involved with assuring the quality of PM2.5 mass, and was headed by a State
and local representative. The workgroup developed three tables of criteria, where each table has
a different degree of implication about the quality of the data. The criteria included on the tables
are from 40 CFR Part 50 Appendices L and N, 40 CFR Part 58 Appendix A, Method 2.12, and a
few criteria that were neither in CFR nor Method 2.12 but which the workgroup felt should be
included. Upon completion and use of the tables, it was decided that a validation template
should be developed for all the criteria pollutants.
One of the tables has the criteria that the workgroup felt must be met to ensure the quality of the
data. An example criterion for PM2.5 is that the average flow rate for the sampling period must
be maintained to within 5% of 16.67 liters per minute. The second table has the criteria that
indicate that there might be a problem with the quality of the data and further investigation is
warranted before making a determination about the validity of the sample or samples. An
example criterion is that the field filter blanks should not change weight by more than 30
micrograms between weighings. The third table has criteria that indicate a potentially systematic
problem with the environmental data collection activity. Such systematic problems may impact
the ability to make decisions with the data. An example criterion is that at least 75% of the
scheduled samples for each quarter should be successfully collected and validated.
To determine the appropriate table for each criterion, the members of the workgroup considered
how significantly the criterion impacted the resulting concentration. This was based on
experience from workgroup members, experience from non-workgroup members, and feasibility
of implementing the criterion.
Criteria that were deemed critical to maintaining the integrity of a sample or group of samples were placed on the first table. Observations that do not meet each and every criterion on the Critical Criteria Table should be invalidated unless there are compelling reasons and justification for not doing so. The sample or group of samples for which one or more of these criteria are not met is invalid until proven otherwise. The cause of not operating in the acceptable range for each violated criterion must be investigated and minimized to reduce the likelihood that additional samples will be invalidated.
Criteria that are important for maintaining and evaluating the quality of the data collection
system are included on the second table, the Operational Evaluations Table. Violation of a
criterion or a number of criteria may be cause for invalidation. The decision maker should
consider other quality control information that may or may not indicate the data are acceptable
for the parameter being controlled. Therefore, the sample or group of samples for which one or
more of these criteria are not met is suspect unless other quality control information
demonstrates otherwise. The reason for not meeting the criteria MUST be investigated,
mitigated or justified.
Finally, those criteria which are important for the correct interpretation of the data but do not usually impact the validity of a sample or group of samples are included on the third table, the Systematic Issues Table. For example, the data quality objectives are included in this table. If the data quality objectives are not met, this does not invalidate any of the samples, but it may impact the error rate associated with the attainment/non-attainment decision.
Please note that the designations Operational and Systematic do not imply that these quality control checks need not be performed. If an operational or systematic quality control check that is required by regulation is not performed, that can be a basis for invalidation of all associated data.
Following are the tables for all the criteria pollutants. For each criterion, the tables include: (1) the requirement, (2) the frequency with which compliance is to be evaluated, (3) the acceptance criteria, and (4) information on where the requirement can be found, or additional guidance on the requirement.
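As a sketch of how these four columns might be carried in a monitoring organization's own software (this structure is illustrative, not prescribed by the Handbook), one row of the ozone template below could be encoded like this; the class name and field names are hypothetical:

```python
# Illustrative only: one way to represent a validation-template row.
# The four fields mirror the four table columns described above.
from dataclasses import dataclass

@dataclass
class TemplateRow:
    requirement: str   # column 1: what is checked
    frequency: str     # column 2: how often compliance is evaluated
    acceptance: str    # column 3: acceptance criteria
    reference: str     # column 4: CFR citation or guidance source

one_point_qc = TemplateRow(
    requirement="One Point QC Check, single analyzer (O3)",
    frequency="1/2 weeks",
    acceptance="< +/-7% percent difference",
    reference="40 CFR Part 58 App A Sec 3.2; DQO in Sec 2.3.1.2",
)
print(one_point_qc.acceptance)
```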
The validation templates have been developed based on the current state of knowledge. The templates should evolve as new information is discovered about the impact of the various criteria on the error in the resulting mass estimate or concentration. Due to the potential misuse of invalid data, data that are invalidated will not be uploaded to AQS but should be retained in the monitoring organization's local database. These data will be invaluable to the evolution of the validation templates.
Use of Bold Italics Font to Identify CFR Requirements
The criteria listed in the validation templates are either requirements that can be found in the Code of Federal Regulations, guidance found in a variety of guidance documents, or recommendations by the QA Workgroup or EPA. Any time a CFR requirement is identified in the Requirement, Frequency or Acceptance Criteria column, it is identified in bold italics font. The Information/Action column provides the appropriate references to CFR or guidance documents.
Hyperlink References
Where requirements or guidance documents are found on the web, a hyperlink is provided that leads the user to the closest URL address. Any links to CFR are directed to the electronic CFR document (e-CFR), which is the most up to date. An e-CFR link does not lead to an individual section; therefore, e-CFR is hyperlinked only once on each page.
PM10 Note of Caution
The validation templates for PM10 get complicated because PM10 is required to be reported at standard temperature and pressure (STP) for comparison to the NAAQS (following 40 CFR Part 50 App J) and at local conditions if used to monitor for PM10-2.5 (following 40 CFR Part 50 App O). Moreover, PM10 can be measured with filter-based sampling techniques as well as with automated methods. The validation templates developed for PM10 try to accommodate these differences, but monitoring organizations are cautioned to review the operations manual for the monitors/samplers they use and augment the validation template with QC information specific to their EPA reference or equivalent method designation and instrument.
http://www.epa.gov/ttn/amtic/files/ambient/criteria/reference-equivalent-methods-list.pdf
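The STP versus local-conditions distinction is an ideal-gas volume correction. The sketch below assumes EPA standard conditions of 25 °C (298.15 K) and 760 mm Hg and the usual ideal-gas scaling; it is a worked illustration, not language taken from a regulation, and the example readings are made up.

```python
# Sketch of the ideal-gas correction between local-conditions and STP
# mass concentrations (assumed EPA STP: 25 C = 298.15 K, 760 mm Hg).

def local_to_stp(conc_local_ugm3, temp_c, pressure_mmhg):
    """Convert ug/m3 at local conditions to ug/m3 at STP."""
    t_amb_k = temp_c + 273.15
    # A cubic meter of ambient air corresponds to a smaller (or larger)
    # standard volume, so the concentration scales inversely with volume.
    return conc_local_ugm3 * (760.0 / pressure_mmhg) * (t_amb_k / 298.15)

# Example: 50 ug/m3 measured at 10 C and 640 mm Hg (a high-altitude site)
print(round(local_to_stp(50.0, 10.0, 640.0), 1))  # about 56.4 ug/m3
```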
Ozone Validation Template

1) Requirement (O3) | 2) Frequency | 3) Acceptance Criteria | Information/Action

CRITICAL CRITERIA - OZONE
One Point QC Check, single analyzer | 1/2 weeks | < ±7% (percent difference) | 1 and 2) 40 CFR Part 58 App A Sec 3.2; 3) Recommendation based on DQO in 40 CFR Part 58 App A Sec 2.3.1.2. QC check conc range 0.01-0.10 ppm, relative to routine concentrations
Zero/span check | 1/2 weeks | Zero drift < ±1.5 ppb; span drift < ±7% | 1 and 2) QA Handbook Volume 2 Section 12.3; 3) Recommendation and related to DQO

OPERATIONAL CRITERIA - OZONE
Shelter Temperature Range | Daily (hourly values) | 20 to 30 °C (hourly avg), or per manufacturer's specifications if designated to a wider temperature range | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2. Generally the 20-30 °C range will apply, but the most restrictive operable range of the instruments in the shelter may also be used as guidance. The FRM/FEM list found on AMTIC provides the temperature range for a given instrument. FRM/FEM monitor testing is required at the 20-30 °C range per 40 CFR Part 53.32
Shelter Temperature Control | Daily (hourly values) | < ±2 °C SD over 24 hours | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Shelter Temperature Device Check | 1/6 mo | ±2 °C of standard | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Annual Performance Evaluation, single analyzer | Every site 1/year within period of monitor operation; 25% of sites quarterly | Percent difference of audit levels 3-10 < ±15%; audit levels 1 and 2: ±1.5 ppb difference or ±15% | 1 and 2) 40 CFR Part 58 App A Sec 3.2.2; 3) Recommendation: 3 audit concentrations not including zero; AMTIC guidance 2/17/2011, http://www.epa.gov/ttn/amtic/cpreldoc.html
Federal Audits (NPAP) | 1/year at selected sites; 20% of sites audited | Audit levels 1 and 2: ±1.5 ppb difference; all other levels: percent difference ±10% | 1) 40 CFR Part 58 App A Sec 2.4; 2) NPAP adequacy requirements on AMTIC; 3) NPAP QAPP/SOP
Verification/Calibration | Upon receipt/adjustment/repair/installation/moving and recalibration of standard of higher level; 1/6 months if manual zero/span performed biweekly; 1/year if continuous zero/span performed daily | All points within ±2% of calibration range of best-fit straight line; linearity error < 5% | 1) 40 CFR Part 50 App D; 2) Recommendation; 3) Recommendation; linearity error per 40 CFR Part 50 App D. Multi-point calibration (0 and 4 upscale points), 40 CFR Part 50 App D Sec 5.2.3
Zero Air/Zero Air Check | 1/year | Concentrations below LDL | 1) 40 CFR Part 50 App D Section 4.1; 2 and 3) Recommendation
Ozone Level 2 Standard Certification/recertification to Standard Reference Photometer (Level 1) | 1/year | Single point difference < ±3% | 1) 40 CFR Part 50 App D Section 5.4; 2 and 3) Transfer Standard Guidance EPA-454/B-10-001. Level 2 standard (formerly called primary standard) usually transported to the EPA Region's SRP for comparison
Level 2 and Greater Transfer Standard Precision | 1/year | Standard deviation less than 0.005 ppm, or 3%, whichever is greater | 1) 40 CFR Part 50 Appendix D Sec 3.1; 2) Recommendation, part of reverification; 3) 40 CFR Part 50 Appendix D Sec 3.1
Level 2 and Greater Transfer Standard Precision (if recertified via a transfer standard) | 1/year | Regression slopes = 1.00 ±0.03 and both intercepts 0 ±3 ppb | 1, 2 and 3) Transfer Standard Guidance EPA-454/B-10-001
Ozone Transfer Standard (Level 3 and greater): Qualification | Upon receipt of transfer standard | ±4% or ±4 ppb, whichever is greater | 1, 2 and 3) Transfer Standard Guidance EPA-454/B-10-001
Ozone Transfer Standard (Level 3 and greater): Certification | After qualification and upon receipt/adjustment/repair | RSD of six slopes < 3.7%; std. dev. of 6 intercepts < 1.5 | 1, 2 and 3) Transfer Standard Guidance EPA-454/B-10-001
Ozone Transfer Standard (Level 3 and greater): Recertification to higher level standard | Beginning and end of O3 season, or 1/6 months, whichever is less | New slope = ±0.05 of previous; RSD of six slopes < 3.7%; std. dev. of 6 intercepts < 1.5 | 1, 2 and 3) Transfer Standard Guidance EPA-454/B-10-001. The recertification test is then added to the most recent 5 tests; if it does not meet acceptability, certification fails
Detection (FEM/FRMs): Noise | Upon receipt/adjustment/repair/installation/moving and recalibration, or 1/year | < 0.005 ppm | 1) 40 CFR Part 53.23(b) (definition and procedure); 2) NA; 3) 40 CFR Part 53.20 Table B-1
Detection (FEM/FRMs): Lower detectable level | 1/year | 0.01 ppm | 1) 40 CFR Part 53.23(b) (definition and procedure); 2) Recommendation; 3) 40 CFR Part 53.20 Table B-1

SYSTEMATIC CRITERIA - OZONE
Sampler/Monitor/Transfer and Calibration Standard | NA | Meets requirements listed in FRM/FEM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 and FRM/FEM method list
Standard Reporting Units | All data | ppm (final units in AQS) | 1, 2 and 3) 40 CFR Part 50 App I Sec 2.1.1
Rounding convention for data reported to AQS | All data | 3 places after decimal, with digits to the right truncated | 1, 2 and 3) 40 CFR Part 50 App I Sec 2.1.1
Completeness (seasonal) | 3-year comparison | > 90% (avg) daily max available in ozone season, with a minimum of 75% in any one year | 1) 40 CFR Part 50 App I; 2) 40 CFR Part 50 App I Section 2.3; 3) 40 CFR Part 50 App I Section 2.3(b)
Completeness | 8-hour average | > 75% of hourly averages for the 8-hour period | 1) 40 CFR Part 50 App I; 2 and 3) 40 CFR Part 50 App I Section 2.1.1
Completeness | Valid daily max | > 75% of the 24 8-hour averages (18 of 24 8-hour averages) | 1) 40 CFR Part 50 App I; 2) 40 CFR Part 50 App I Section 2.1.2; 3) 40 CFR Part 50 App I Section 2.1.2(b)
Sample Residence Time Verification | 1/year | < 20 seconds | 1) 40 CFR Part 58 App E Section 9(c); 2) Recommendation; 3) 40 CFR Part 58 App E Section 9(c)
Sample Probe, Inlet, Sampling Train | All sites | Borosilicate glass (e.g., Pyrex) or Teflon | 1) 40 CFR Part 58 App E Section 9(a); 2) Recommendation; 3) 40 CFR Part 58 App E Section 9(a). FEP and PFA have been accepted as equivalent materials to Teflon. Replacement or cleaning is suggested 1/year, and more frequently if pollutant load or contamination dictates
Siting | 1/year | Meets siting criteria, or waiver documented | 1) 40 CFR Part 58 App E Sections 2-6; 2) Recommendation; 3) 40 CFR Part 58 App E Sections 2-6
EPA Standard Ozone Reference Photometer (SRP) Recertification (Level 1) | 1/year | Regression slope = 1.00 ±0.01 and intercept < 3 ppb | 1, 2 and 3) Transfer Standard Guidance EPA-454/B-10-001. This is usually done at a Regional Office and compared against the traveling SRP
Precision (using 1-point QC checks) | Calculated annually and as appropriate for design value estimates | 90% CL CV < 7% | 1) 40 CFR Part 58 App A Sec 2.3.1.2 and 3.2.1; 2) 40 CFR Part 58 App A Sec 4(b); 3) 40 CFR Part 58 App A Sec 4.1.2
Bias (using 1-point QC checks) | Calculated annually and as appropriate for design value estimates | 95% CL < ±7% | 1) 40 CFR Part 58 App A Sec 2.3.1.2 and 3.2.1; 2) 40 CFR Part 58 App A Sec 4(b); 3) 40 CFR Part 58 App A Sec 4.1.3
Annual PE Primary QA Organization (PQAO) Evaluation | 1/year | 95% of audit percent differences fall within the one-point QC check 95% probability intervals at the PQAO level of aggregation | 1) 40 CFR Part 58 App A Section 3.2.2; 2) Recommendation; 3) 40 CFR Part 58 App A Sec 4.1.4 and 4.1.5
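The one-point QC check in the template above is evaluated as a percent difference between the analyzer response and the audit concentration. A short worked example follows; the formula shown is the standard percent-difference calculation used for gaseous QC checks, and the two concentrations are hypothetical.

```python
# Worked example of the ozone one-point QC check (< +/-7% criterion).
# Percent difference = 100 * (measured - audit) / audit; values made up.

def percent_difference(measured_ppm, audit_ppm):
    return 100.0 * (measured_ppm - audit_ppm) / audit_ppm

audit = 0.070     # audit concentration within the 0.01-0.10 ppm QC range
measured = 0.066  # analyzer response (hypothetical)

d = percent_difference(measured, audit)
print(f"percent difference = {d:+.1f}%")    # -5.7%
print("pass" if abs(d) <= 7.0 else "fail")  # within the +/-7% criterion
```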
CO Validation Template

1) Requirement (CO) | 2) Frequency | 3) Acceptance Criteria | Information/Action

CRITICAL CRITERIA - CO
One Point QC Check, single analyzer | 1/2 weeks | < ±10% (percent difference) | 1 and 2) 40 CFR Part 58 App A Sec 3.2; 3) Recommendation based on DQO in 40 CFR Part 58 App A Sec 2.3.1. QC check conc range 1-10 ppm, relative to routine concentrations
Zero/span check | 1/2 weeks | Zero drift < ±0.03 ppm; span drift < ±10% | 1 and 2) QA Handbook Volume 2 Section 12.3; 3) Recommendation

OPERATIONAL CRITERIA - CO
Shelter Temperature Range | Daily (hourly values) | 20 to 30 °C (hourly avg), or per manufacturer's specifications if designated to a wider temperature range | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2. Generally the 20-30 °C range will apply, but the most restrictive operable range of the instruments in the shelter may also be used as guidance. The FRM/FEM list found on AMTIC provides the temperature range for a given instrument. FRM/FEM monitor testing is required at the 20-30 °C range per 40 CFR Part 53.32
Shelter Temperature Control | Daily (hourly values) | < ±2 °C SD over 24 hours | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Shelter Temperature Device Check | 1/6 mo | ±2 °C of standard | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Annual Performance Evaluation, single analyzer | Every site 1/year; 25% of sites quarterly | Percent difference of audit levels 3-10 < ±15%; audit levels 1 and 2: ±0.03 ppm difference or ±15% | 1 and 2) 40 CFR Part 58 App A Sec 3.2.2; 3) Recommendation: 3 audit concentrations not including zero; AMTIC guidance 2/17/2011, http://www.epa.gov/ttn/amtic/cpreldoc.html
Federal Audits (NPAP) | 1/year at selected sites; 20% of sites audited | Audit levels 1 and 2: ±0.03 ppm difference; all other levels: percent difference ±15% | 1) 40 CFR Part 58 App A Sec 2.4; 2) NPAP adequacy requirements on AMTIC; 3) NPAP QAPP/SOP
Verification/Calibration | Upon receipt/adjustment/repair/installation/moving; 1/6 months if manual zero/span performed biweekly; 1/year if continuous zero/span performed daily | All points within ±2% of calibration range of best-fit straight line | 1) 40 CFR Part 50 Appendix C Section 4; 2 and 3) Recommendation. See details about CO2-sensitive instruments. Multi-point calibration (0 and 4 upscale points)
Gaseous Standards | All gas cylinders | NIST traceable (e.g., EPA Protocol Gas) | 1) 40 CFR Part 50 Appendix C Section 4.3.1; 2) NA, Green Book; 3) 40 CFR Part 50 Appendix C Section 4.3.1. See details about CO2-sensitive instruments. The gas producer used must participate in the EPA Ambient Air Protocol Gas Verification Program, 40 CFR Part 58 App A Sec 2.6.1
Zero Air/Zero Air Check | 1/year | < 0.1 ppm CO | 1) 40 CFR Part 50 App C Section 4.3.2; 2) Recommendation; 3) 40 CFR Part 50 App C Section 4.3.2
Gas Dilution Systems | 1/year, or after failure of a 1-point QC check or performance evaluation | Accuracy ±2% | 1, 2 and 3) Recommendation based on SO2 requirement in 40 CFR Part 50 App A-1 Sec 4.1.2
Detection (FEM/FRMs): Noise | 1/year | 0.2 ppm (standard range); 0.1 ppm (lower range) | 1) 40 CFR Part 53.23(b) (definition and procedure); 2) Recommendation, information obtained from LDL; 3) 40 CFR Part 53.20 Table B-1
Detection (FEM/FRMs): Lower detectable level | 1/year | 0.4 ppm (standard range); 0.2 ppm (lower range) | 1) 40 CFR Part 53.23(c) (definition and procedure); 2) Recommendation; 3) 40 CFR Part 53.20 Table B-1

SYSTEMATIC CRITERIA - CO
Sampler/Monitor | NA | Meets requirements listed in FRM/FEM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 and FRM/FEM method list
Standard Reporting Units | All data | ppm (final units in AQS) | 1, 2 and 3) 40 CFR Part 50.8(a)
Rounding convention for data reported to AQS | All data | 1 decimal place | 1, 2 and 3) 40 CFR Part 50.8(d) (for averaging values for comparison to the NAAQS, not for reporting individual hourly values)
Completeness | 8-hour standard | 75% of hourly averages for the 8-hour period | 1) 40 CFR Part 50.8(c); 2) 40 CFR Part 50.8(a-2); 3) 40 CFR Part 50.8(c)
Sample Residence Time Verification | 1/year | < 20 seconds | 1, 2 and 3) Recommendation. CO is not a reactive gas, but following the same methods as the other gaseous criteria pollutants is suggested
Sample Probe, Inlet, Sampling Train | All sites | Borosilicate glass (e.g., Pyrex) or Teflon | 1, 2 and 3) Recommendation. CO is not a reactive gas, but following the same methods as the other gaseous criteria pollutants is suggested. FEP and PFA have been accepted as equivalent materials to Teflon. Replacement/cleaning is suggested 1/year, and more frequently if pollutant load dictates
Siting | 1/year | Meets siting criteria, or waiver documented | 1) 40 CFR Part 58 App E Sections 2-6; 2) Recommendation; 3) 40 CFR Part 58 App E Sections 2-6
Precision (using 1-point QC checks) | Calculated annually and as appropriate for design value estimates | 90% CL CV < 10% | 1) 40 CFR Part 58 App A Sec 3.2.1; 2) 40 CFR Part 58 App A Sec 4(b); 3) 40 CFR Part 58 App A Sec 4.1.2
Bias (using 1-point QC checks) | Calculated annually and as appropriate for design value estimates | 95% CL < ±10% | 1) 40 CFR Part 58 App A Sec 3.2.1; 2) 40 CFR Part 58 App A Sec 4(b); 3) 40 CFR Part 58 App A Sec 4.1.3
Annual PE Primary QA Organization (PQAO) Evaluation | 1/year | 95% of audit percent differences fall within the one-point QC check 95% probability intervals at the PQAO level of aggregation | 1) 40 CFR Part 58 App A Section 3.2.2; 2) Recommendation; 3) 40 CFR Part 58 App A Sec 4.1.4 and 4.1.5
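The CO completeness row above requires that at least 75% of the hourly averages (6 of 8 hours) be present before an 8-hour average is computed. A minimal sketch of that test, with hypothetical hourly values and None marking a missing hour:

```python
# Sketch of the CO 8-hour completeness test (40 CFR Part 50.8):
# an 8-hour average is valid with at least 6 of 8 hourly values.

def eight_hour_average(hourly_ppm):
    valid = [v for v in hourly_ppm if v is not None]
    if len(valid) < 0.75 * 8:       # fewer than 6 valid hours
        return None                 # average cannot be computed
    return sum(valid) / len(valid)  # average of the available hours

window = [1.2, 1.4, None, 1.1, 1.3, None, 1.5, 1.2]  # 6 of 8 hours valid
print(eight_hour_average(window))   # 1.2833...
```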
NO2, NOx, NO Validation Template

1) Requirement (NO2) | 2) Frequency | 3) Acceptance Criteria | Information/Action

CRITICAL CRITERIA - NO2
One Point QC Check, single analyzer | 1/2 weeks | < ±15% (percent difference) | 1 and 2) 40 CFR Part 58 App A Sec 3.2; 3) Recommendation based on DQO in 40 CFR Part 58 App A Sec 2.3.1.5. QC check conc range 0.01-0.10 ppm, relative to routine concentrations
Zero/span check | 1/2 weeks | Zero drift < ±1.5 ppb; span drift < ±10% | 1 and 2) QA Handbook Volume 2 Section 12.3; 3) Recommendation and related to DQO
Converter Efficiency | During multi-point calibrations, span and audit; 1/2 weeks | > 96% (96%-104%) | 1) 40 CFR Part 50 App F Sections 1.5.10 and 2.4.10; 2) Recommendation; 3) 40 CFR Part 50 App F Sections 1.5.10 and 2.4.10. Regulation states > 96%; 96-104% is a recommendation

OPERATIONAL CRITERIA - NO2
Shelter Temperature Range | Daily (hourly values) | 20 to 30 °C (hourly avg), or per manufacturer's specifications if designated to a wider temperature range | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2. Generally the 20-30 °C range will apply, but the most restrictive operable range of the instruments in the shelter may also be used as guidance. The FRM/FEM list found on AMTIC provides the temperature range for a given instrument. FRM/FEM monitor testing is required at the 20-30 °C range per 40 CFR Part 53.32
Shelter Temperature Control | Daily (hourly values) | < ±2 °C SD over 24 hours | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Shelter Temperature Device Check | 1/6 mo | ±2 °C of standard | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Annual Performance Evaluation, single analyzer | Every site 1/year; 25% of sites quarterly | Percent difference of audit levels 3-10 < ±15%; audit levels 1 and 2: ±1.5 ppb difference or ±15% | 1) 40 CFR Part 58 App A Sec 3.2.2; 2) 40 CFR Part 58 App A Sec 3.2.2; 3) Recommendation: 3 audit concentrations not including zero; AMTIC guidance 2/17/2011, http://www.epa.gov/ttn/amtic/cpreldoc.html
Federal Audits (NPAP) | 1/year at selected sites; 20% of sites audited | Audit levels 1 and 2: ±1.5 ppb difference; all other levels: percent difference ±15% | 1) 40 CFR Part 58 App A Sec 2.4; 2) NPAP adequacy requirements on AMTIC; 3) NPAP QAPP/SOP
Verification/Calibration | Upon receipt/adjustment/repair/installation/moving; 1/6 months if manual zero/span performed biweekly; 1/year if continuous zero/span performed daily | Instrument residence time < 2 min; dynamic parameter > 2.75 ppm-min; all points within ±2% of calibration range of best-fit straight line | 1) 40 CFR Part 50 App F; 2 and 3) Recommendation. Multi-point calibration (0 and 4 upscale points)
Gaseous Standards | All gas cylinders | NIST traceable (e.g., EPA Protocol Gas); 50-100 ppm of NO in nitrogen with < 1 ppm NO2 | 1) 40 CFR Part 50 App F Section 1.3.1; 2) NA, Green Book; 3) 40 CFR Part 50 App F Section 1.3.1. The gas producer used must participate in the EPA Ambient Air Protocol Gas Verification Program, 40 CFR Part 58 App A Sec 2.6.1
Zero Air/Zero Air Check | 1/year | Concentrations below LDL | 1) 40 CFR Part 50 App F Section 1.3.2; 2 and 3) Recommendation
Gas Dilution Systems | 1/year, or after failure of a 1-point QC check or performance evaluation | Accuracy ±2% | 1, 2 and 3) Recommendation based on SO2 requirement in 40 CFR Part 50 App A-1 Sec 4.1.2
Detection (FEM/FRMs): Noise | NA | 0.005 ppm | 1) 40 CFR Part 53.23(b) (definition and procedure); 2) NA; 3) 40 CFR Part 53.20 Table B-1
Detection (FEM/FRMs): Lower detectable level | 1/year | 0.01 ppm | 1) 40 CFR Part 53.23(c) (definition and procedure); 2) Recommendation; 3) 40 CFR Part 53.20 Table B-1

SYSTEMATIC CRITERIA - NO2
Sampler/Monitor | NA | Meets requirements listed in FRM/FEM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 and FRM/FEM method list
Standard Reporting Units | All data | ppb (final units in AQS) | 1, 2 and 3) 40 CFR Part 50 App S Sec 2(c)
Rounding convention for data reported to AQS | All data | 1 place after decimal, with digits to the right truncated | 1, 2 and 3) 40 CFR Part 50 App S Sec 4.2(a)
Completeness | Annual standard | 75% of hours in the year | 1) 40 CFR Part 50 App S Sec 3.1(b); 2) 40 CFR Part 50 App S Sec 3.1(a); 3) 40 CFR Part 50 App S Sec 3.1(b)
Completeness | 1-hour standard | (1) 3 consecutive calendar years of complete data; (2) 4 complete quarters in each year; (3) 75% of sampling days in each quarter; (4) 75% of hours in a day | 1) 40 CFR Part 50 App S Sec 3.2(b); 2) 40 CFR Part 50 App S Sec 3.2(a); 3) 40 CFR Part 50 App S Sec 3.2(b). More details in 40 CFR Part 50 App S
Sample Residence Time Verification | 1/year | < 20 seconds | 1) 40 CFR Part 58 App E Section 9(c); 2) Recommendation; 3) 40 CFR Part 58 App E Section 9(c)
Sample Probe, Inlet, Sampling Train | All sites | Borosilicate glass (e.g., Pyrex) or Teflon | 1, 2 and 3) 40 CFR Part 58 App E Sec 9(a). FEP and PFA have been accepted as equivalent materials to Teflon. Replacement or cleaning is suggested 1/year, and more frequently if pollutant load or contamination dictates
Siting | 1/year | Meets siting criteria, or waiver documented | 1) 40 CFR Part 58 App E Sections 2-6; 2) Recommendation; 3) 40 CFR Part 58 App E Sections 2-6
Precision (using 1-point QC checks) | Calculated annually and as appropriate for design value estimates | 90% CL CV < 15% | 1) 40 CFR Part 58 App A Sec 2.3.1.5 and 3.2.1; 2) 40 CFR Part 58 App A Sec 4(b); 3) 40 CFR Part 58 App A Sec 4.1.2
Bias (using 1-point QC checks) | Calculated annually and as appropriate for design value estimates | 95% CL < ±15% | 1) 40 CFR Part 58 App A Sec 2.3.1.5 and 3.2.1; 2) 40 CFR Part 58 App A Sec 4(b); 3) 40 CFR Part 58 App A Sec 4.1.3
Annual PE Primary QA Organization (PQAO) Evaluation | 1/year | 95% of audit percent differences fall within the one-point QC check 95% probability intervals at the PQAO level of aggregation | 1) 40 CFR Part 58 App A Section 3.2.2; 2) Recommendation; 3) 40 CFR Part 58 App A Sec 4.1.4 and 4.1.5
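The converter-efficiency criterion above (> 96%, with 96-104% recommended) can be pictured as a simple ratio check. The expression below is a simplification, assuming efficiency is taken as indicated NO2 over expected NO2 during a gas-phase titration; the full procedure is in 40 CFR Part 50 App F Sec 1.5.10, and the numbers used are hypothetical.

```python
# Illustration of the NO2 converter-efficiency acceptance window.
# Simplified ratio form; see 40 CFR Part 50 App F for the exact procedure.

def converter_efficiency(no2_indicated_ppb, no2_expected_ppb):
    return 100.0 * no2_indicated_ppb / no2_expected_ppb

eff = converter_efficiency(78.4, 80.0)  # hypothetical titration values
print(f"{eff:.1f}%")                    # 98.0%
# Regulation requires > 96%; the template recommends 96-104%.
print("pass" if 96.0 <= eff <= 104.0 else "investigate")
```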
SO2 Validation Template

1) Requirement (SO2) | 2) Frequency | 3) Acceptance Criteria | Information/Action

CRITICAL CRITERIA - SO2
One Point QC Check, single analyzer | 1/2 weeks | < ±10% (percent difference) | 1 and 2) 40 CFR Part 58 App A Sec 3.2; 3) Recommendation based on DQO in 40 CFR Part 58 App A Sec 2.3.1.2. QC check conc range 0.01-0.10 ppm, relative to routine concentrations
Zero/span check | 1/2 weeks | Zero drift < ±1.5 ppb; span drift < ±10% | 1 and 2) QA Handbook Volume 2 Section 12.3; 3) Recommendation and related to DQO

OPERATIONAL CRITERIA - SO2
Shelter Temperature Range | Daily (hourly values) | 20 to 30 °C (hourly avg), or per manufacturer's specifications if designated to a wider temperature range | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2. Generally the 20-30 °C range will apply, but the most restrictive operable range of the instruments in the shelter may also be used as guidance. The FRM/FEM list found on AMTIC provides the temperature range for a given instrument. FRM/FEM monitor testing is required at the 20-30 °C range per 40 CFR Part 53.32
Shelter Temperature Control | Daily (hourly values) | < ±2 °C SD over 24 hours | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Shelter Temperature Device Check | 1/6 mo | ±2 °C of standard | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Annual Performance Evaluation, single analyzer | Every site 1/year; 25% of sites quarterly | Percent difference of audit levels 3-10 < ±15%; audit levels 1 and 2: ±1.5 ppb difference or ±15% | 1 and 2) 40 CFR Part 58 App A Sec 3.2.2; 3) Recommendation: 3 audit concentrations not including zero; AMTIC guidance 2/17/2011, http://www.epa.gov/ttn/amtic/cpreldoc.html
Federal Audits (NPAP) | 1/year at selected sites; 20% of sites audited | Audit levels 1 and 2: ±1.5 ppb difference; all other levels: percent difference ±15% | 1) 40 CFR Part 58 App A Sec 2.4; 2) NPAP adequacy requirements on AMTIC; 3) NPAP QAPP/SOP
Verification/Calibration | Upon receipt/adjustment/repair/installation/moving; 1/6 months if manual zero/span performed biweekly; 1/year if continuous zero/span performed daily | All points within ±2% of calibration range of best-fit straight line | 1) 40 CFR Part 50 App A-1 Section 4; 2 and 3) Recommendation. Multi-point calibration (0 and 4 upscale points)
Gaseous Standards | All gas cylinders | NIST traceable (e.g., EPA Protocol Gas) | 1) 40 CFR Part 50 App A-1 Section 4.1.6.1; 2) NA, Green Book; 3) 40 CFR Part 50 App F Section 1.3.1. Producers must participate in the Ambient Air Protocol Gas Verification Program, 40 CFR Part 58 App A Sec 2.6.1
Zero Air/Zero Air Check | 1/year | Concentrations below LDL; < 0.1 ppm aromatic hydrocarbons | 1) 40 CFR Part 50 App A-1 Section 4.1.6.2; 2) Recommendation; 3) Recommendation and 40 CFR Part 50 App A-1 Section 4.1.6.2
Gas Dilution Systems | 1/year, or after failure of a 1-point QC check or performance evaluation | Accuracy ±2% | 1) 40 CFR Part 50 App A-1 Sec 4.1.2; 2) Recommendation; 3) 40 CFR Part 50 App A-1 Sec 4.1.2
Detection (FEM/FRMs): Noise | NA | 0.001 ppm (standard range); 0.0005 ppm (lower range) | 1) 40 CFR Part 53.23(b) (definition and procedure); 2) NA; 3) 40 CFR Part 53.20 Table B-1
Detection (FEM/FRMs): Lower detectable level | 1/year | 0.002 ppm (standard range); 0.001 ppm (lower range) | 1) 40 CFR Part 53.23(c) (definition and procedure); 2) Recommendation; 3) 40 CFR Part 53.20 Table B-1

SYSTEMATIC CRITERIA - SO2
Sampler/Monitor | NA | Meets requirements listed in FRM/FEM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 and FRM/FEM method list
Standard Reporting Units | All data | ppb (final units in AQS) | 1, 2 and 3) 40 CFR Part 50 App T Sec 2(c)
Rounding convention for data reported to AQS | All data | 1 place after decimal, with digits to the right truncated | 1, 2 and 3) 40 CFR Part 50 App T Sec 2(c)
Completeness | 1-hour standard | Hour: 75% of the hour; day: 75% of hourly concentrations; quarter: 75% complete days; year: 4 complete quarters; 5-min values reported only for valid hours | 1, 2 and 3) 40 CFR Part 50 App T Section 3(b), (c). More details in CFR on acceptable completeness. 5-min values or the 5-min max value are reported only for the valid portion of the hour; if the hour is incomplete, no 5-min or 5-min max value is reported
Sample Residence Time Verification | 1/year | < 20 seconds | 1) 40 CFR Part 58 App E Section 9(c); 2) Recommendation; 3) 40 CFR Part 58 App E Section 9(c)
Sample Probe, Inlet, Sampling Train | All sites | Borosilicate glass (e.g., Pyrex) or Teflon | 1, 2 and 3) 40 CFR Part 58 App E Sec 9(a). FEP and PFA have been accepted as equivalent materials to Teflon. Replacement or cleaning is suggested 1/year, and more frequently if pollutant load or contamination dictates
Siting | 1/year | Meets siting criteria, or waiver documented | 1) 40 CFR Part 58 App E Sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E Sections 2-5
Precision (using 1-point QC checks) | Calculated annually and as appropriate for design value estimates | 90% CL CV < 10% | 1) 40 CFR Part 58 App A Sec 2.3.1.6 and 3.2.1; 2) 40 CFR Part 58 App A Sec 4(b); 3) 40 CFR Part 58 App A Sec 4.1.2
Bias (using 1-point QC checks) | Calculated annually and as appropriate for design value estimates | 95% CL < ±10% | 1) 40 CFR Part 58 App A Sec 2.3.1.6 and 3.2.1; 2) 40 CFR Part 58 App A Sec 4(b); 3) 40 CFR Part 58 App A Sec 4.1.3
Annual PE Primary QA Organization (PQAO) Evaluation | 1/year | 95% of audit percent differences fall within the one-point QC check 95% probability intervals at the PQAO level of aggregation | 1) 40 CFR Part 58 App A Section 3.2.2; 2) Recommendation; 3) 40 CFR Part 58 App A Sec 4.1.4 and 4.1.5
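The precision and bias rows that close each gaseous template (90% CL CV and 95% CL bias) are computed from the accumulated one-point QC percent differences. The sketch below is a hedged rendering of the 40 CFR Part 58 Appendix A Sec 4.1.2-4.1.3 estimators as they are commonly implemented; the formulas were not reproduced in this appendix, so consult the CFR text for the authoritative versions, and note that the input data here are invented.

```python
# Hedged sketch of the Part 58 App A precision and bias estimators,
# built from one-point QC percent differences d_i = 100*(meas-audit)/audit.
import math
from scipy import stats

def cv_upper_bound(d, conf=0.90):
    """90% upper confidence limit on the coefficient of variation."""
    n = len(d)
    s2 = (n * sum(x * x for x in d) - sum(d) ** 2) / (n * (n - 1))
    chi2 = stats.chi2.ppf(1 - conf, n - 1)   # lower-tail chi-square quantile
    return math.sqrt(s2) * math.sqrt((n - 1) / chi2)

def bias_upper_bound(d, conf=0.95):
    """95% upper bound on the mean absolute percent difference."""
    n = len(d)
    ad = [abs(x) for x in d]
    mean_ad = sum(ad) / n
    s = stats.tstd(ad)                        # sample standard deviation
    return mean_ad + stats.t.ppf(conf, n - 1) * s / math.sqrt(n)

d = [2.1, -3.4, 1.8, -0.9, 4.2, -2.7, 0.5, 3.1]   # hypothetical QC checks
print(f"CV upper bound:   {cv_upper_bound(d):.1f}%  (SO2 goal: < 10%)")
print(f"bias upper bound: {bias_upper_bound(d):.1f}%  (SO2 goal: < 10%)")
```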
PM2.5 Filter Based Local Conditions Validation Template

1) Criteria (PM2.5 LC) | 2) Frequency | 3) Acceptable Range | Information/Action

CRITICAL CRITERIA - PM2.5 Filter Based Local Conditions

Field Activities
Filter Holding Time: Sample Recovery | all filters | < 7 days 9 hours from sample end date | 1, 2 and 3) 40 CFR Part 50 App L Sec 10.10
Sampling Period (including multiple power failures) | all filters | 1380-1500 minutes, or the value if < 1380 minutes and an exceedance of the NAAQS 1/; midnight to midnight local standard time | 1, 2 and 3) 40 CFR Part 50 App L Sec 3.3. See details if less than 1380 min sampled
Average Flow Rate | every 24 hours of operation | average within 5% of 16.67 liters/minute | 1, 2 and 3) 40 CFR Part 50 App L Sec 7.4.3.1
Variability in Flow Rate | every 24 hours of operation | CV < 2% | 1, 2 and 3) 40 CFR Part 50 App L Sec 7.4.3.2
One-point Flow Rate Verification | 1/mo | ±4% of transfer standard; ±5% of flow rate design value | 1, 2 and 3) 40 CFR Part 50 App L Sec 9.2.5 and 7.4.3.1, and 40 CFR Part 58 App A Sec 3.2.3 and 3.3.2

Laboratory Activities
Post-sampling Weighing | all filters | < 10 days from sample end date if shipped at ambient temp, or < 30 days if shipped below avg ambient (or at 4 °C or below for avg sampling temps < 4 °C) from sample end date | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.3.6
Filter Visual Defect Check (unexposed) | all filters | correct type and size; no pinholes, particles or imperfections | 1, 2 and 3) 40 CFR Part 50 App L Sec 10.2
Filter Conditioning Environment: Equilibration | all filters | 24 hours minimum | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.2.5
Filter Conditioning Environment: Temp. Range | all filters | 24-hr mean 20-23 °C | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.2.1
Filter Conditioning Environment: Temp. Control | all filters | ±2 °C SD* over 24 hr | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.2.2
Filter Conditioning Environment: Humidity Range | all filters | 24-hr mean 30%-40% RH, or within 5% of sampling RH but > 20% RH | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.2.3
Filter Conditioning Environment: Humidity Control | all filters | ±5% SD* over 24 hr | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.2.4
Pre/post Sampling RH | all filters | difference in 24-hr means < ±5% RH | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.3.3
Balance | all filters | located in filter conditioning environment | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.3.2

OPERATIONAL EVALUATIONS TABLE - PM2.5 Filter Based Local Conditions

Field Activities
Individual Flow Rates | every 24 hours of operation | no flow rate excursions > ±5% for > 5 min 1/ | 1, 2 and 3) 40 CFR Part 50 App L Sec 7.4.3.1
Filter Temp Sensor | every 24 hours of operation | no excursions > 5 °C lasting longer than 30 min 1/ | 1, 2 and 3) 40 CFR Part 50 App L Sec 7.4.11.4

Routine Verifications
External Leak Check | every 5 sampling events | < 80 mL/min (see Comment #1) | 1) 40 CFR Part 50 App L Sec 7.4.6.1; 2) Method 2.12 Table 8-1; 3) 40 CFR Part 50 App L Sec 7.4.6.1
Internal Leak Check | every 5 sampling events | < 80 mL/min | 1) 40 CFR Part 50 App L Sec 7.4.6.2; 2) Method 2.12 Table 8-1; 3) 40 CFR Part 50 App L Sec 7.4.6.2
One-point Temp Verification | 1/mo | ±2 °C | 1) 40 CFR Part 50 App L Sec 9.3; 2) Method 2.12 Table 6-1; 3) Recommendation
Pressure Verification | 1/mo | ±10 mm Hg | 1) 40 CFR Part 50 App L Sec 9.3; 2) Method 2.12 Table 6-1; 3) Recommendation

Annual Multi-point Verifications/Calibrations
Temperature Multi-point Verification/Calibration | on installation, then 1/yr | ±2 °C | 1) 40 CFR Part 50 App L Sec 9.3; 2 and 3) Method 2.12 Sec 6.4
Pressure Verification/Calibration | on installation, then 1/yr | ±10 mm Hg | 1) 40 CFR Part 50 App L Sec 9.3; 2 and 3) Method 2.12 Sec 6.5. Sampler BP verified against an independent standard, which is verified against a lab primary standard certified as NIST traceable 1/year
Flow Rate Multi-point Verification/Calibration | electromechanical maintenance, or transport, or 1/yr | ±4% of transfer standard | 1) 40 CFR Part 50 App L Sec 9.2; 2) 40 CFR Part 50 App L Sec 9.1.3 and Method 2.12 Table 6-1; 3) 40 CFR Part 50 App L Sec 9.2.5
Design Flow Rate Adjustment | at one-point or multi-point verification/calibration | ±2% of design flow rate | 1, 2 and 3) 40 CFR Part 50 App L Sec 9.2.6
Other Monitor Calibrations | per manufacturer's operating manual | per manufacturer's operating manual | 1, 2 and 3) Recommendation

Precision
Collocated Samples | every 12 days for 15% of sites by method designation | CV < 10% for samples > 3 µg/m³ | 1 and 2) 40 CFR Part 58 App A Sec 3.2.5; 3) Recommendation based on DQO in 40 CFR Part 58 App A Sec 2.3.1.3

Accuracy
Temperature Audit | 1/yr | ±2 °C | 1, 2 and 3) Method 2.12 Sec 10.2.2 and Table 3-1
Pressure Audit | 1/yr | ±10 mm Hg | 1, 2 and 3) Method 2.12 Sec 10.2.3 and Table 3-1
Semi-Annual Flow Rate Audit | 1/6 mo | ±4% of audit standard; ±5% of design flow rate | 1 and 2) 40 CFR Part 58 App A Sec 3.3.3; 3) Method 2.12 Sec 10.2.1 and Table 10-1

Monitor Maintenance
Impactor (WINS) | every 5 sampling events | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 8.3.1
Very Sharp Cut Cyclone | every 30 days | cleaned/changed | 1, 2 and 3) Recommendation
Inlet/Downtube Cleaning | every 15 sampling events | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3
Filter Chamber Cleaning | 1/mo | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3 and 9.4.1
Circulating Fan Filter Cleaning | 1/mo | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 9.3
Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP |

Laboratory Activities
Lot Blanks | 9 filters per lot | less than 15 µg change between weighings | 1, 2 and 3) Recommendation; used to determine filter stability of the lot of filters received from EPA or a vendor
Exposure Lot Blanks | 3 filters per lot | less than 15 µg change between weighings | 1, 2 and 3) Method 2.12 Sec 7.7; used for preparing a subset of filters for equilibration
Filter Integrity (exposed) | each filter | no visual defects | 1, 2 and 3) Method 2.12 Sec 7.10
Filter Holding Time: Pre-sampling | all filters | < 30 days before sampling | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.3.5

Lab QC Checks
Field Filter Blank | 10% or 1 per weighing session | ±30 µg change between weighings | 1) 40 CFR Part 50 App L Sec 8.3.7.1; 2 and 3) Method 2.12 Sec 7.7
Lab Filter Blank | 10% or 1 per weighing session | ±15 µg change between weighings | 1) 40 CFR Part 50 App L Sec 8.3.7.2; 2 and 3) Method 2.12 Sec 7.7
Balance Check (working standards) | beginning, 10th sample, end | < ±3 µg | 1, 2 and 3) Method 2.12 Sec 7.9
Duplicate Filter Weighing | 1 per weighing session | ±15 µg change between weighings | 1, 2 and 3) Method 2.12 Sec 7.11
Microbalance Audit | 1/yr | ±0.050 mg or manufacturer's specs, whichever is tighter | 1, 2 and 3) Method 2.12 Sec 10.2.6

Verification/Calibration
Lab Temperature | 1/6 months | ±2 °C | 1) Method 2.12 Table 3-2; 2) Recommendation (Table 3-2 suggests every 3 mo); 3) Method 2.12 Table 3-2
Lab Humidity | 1/6 months | ±2% | 1) Method 2.12 Table 3-2; 2) Recommendation (Table 3-2 suggests every 3 mo); 3) Method 2.12 Table 3-2
Microbalance Calibration | at installation and prior to each weighing session; 1/yr | manufacturer's specification | 1) 40 CFR Part 50 App L Sec 8.1; 2) 40 CFR Part 50 App L Sec 8.1 and Method 2.12 Sec 7.2; 3) NA
Calibration and Check Standards: Working Mass Stds. (compared to primary standards) | 1/3 mo | 0.025 mg | 1, 2 and 3) Method 2.12 Sec 4.3 and 7.3
Calibration and Check Standards: Primary Standards | 1/yr | 0.025 mg | 1, 2 and 3) Method 2.12 Sec 4.3 and 7.3

SYSTEMATIC CRITERIA - PM2.5 Filter Based Local Conditions
Sampler/Monitor | NA | Meets requirements listed in FRM/FEM/ARM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 and FRM/FEM method list
Siting | 1/year | Meets siting criteria, or waiver documented | 1) 40 CFR Part 58 App E Sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E Sections 2-5
Data Completeness: Annual Standard | NA | > 75% of scheduled sampling days in each quarter | 1, 2 and 3) 40 CFR Part 50 App N Sec 4.1(b), 4.2(a)
Data Completeness: 24-Hour Standard | NA | > 75% of scheduled sampling days in each quarter | 1, 2 and 3) 40 CFR Part 50 App N Sec 4.1(b), 4.2(a)
Reporting Units | all filters | µg/m³ at ambient temp/pressure (PM2.5) | 1, 2 and 3) 40 CFR Part 50 App N Sec 3.0(b)
Rounding Convention for Data Reported to AQS | all filters | to one decimal place, with additional digits to the right truncated | 1, 2 and 3) 40 CFR Part 50 App N Sec 3.0(b)
Annual 3-yr Average | all concentrations | nearest 0.1 µg/m³ (> 0.05 rounds up) | 1, 2 and 3) 40 CFR Part 50 App N Sec 3 and 4. The rounding convention for data reported to AQS is a recommendation
24-hour, 3-year Average | all concentrations | nearest 1 µg/m³ (> 0.5 rounds up) | 1, 2 and 3) 40 CFR Part 50 App N Sec 3 and 4. The rounding convention for data reported to AQS is a recommendation
Detection Limit: Lower DL | all filters | 2 µg/m³ | 1, 2 and 3) 40 CFR Part 50 App L Sec 3.1
Detection Limit: Upper Conc. Limit | all filters | 200 µg/m³ | 1, 2 and 3) 40 CFR Part 50 App L Sec 3.2
Precision: Single Analyzer (collocated monitors) | 1/3 mo | coefficient of variation (CV) < 10% for values > 3 µg/m³ | 1, 2 and 3) Recommendation, to provide early (quarterly) evaluation of achievement of DQOs
Precision: Primary Quality Assurance Org. | annual and 3-year estimates | 90% CL of CV < 10% for values > 3 µg/m³ | 1, 2 and 3) 40 CFR Part 58 App A Sec 4.3.1 and 2.3.1.1
Bias: Performance Evaluation Program (PEP) | 5 audits for PQAOs with ≤ 5 sites; 8 audits for PQAOs with > 5 sites | ±10% for values > 3 µg/m³ | 1, 2 and 3) 40 CFR Part 58 App A Sec 3.2.7, 4.3.2 and 2.3.1.1

Field Activities - Verification/Calibration Standards Recertifications (all standards should have multi-point certifications against NIST-traceable standards)
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable standard | 1) 40 CFR Part 50 App L Sec 9.1 and 9.2; 2) Method 2.12 Section 6.3.3 and Table 3-1; 3) 40 CFR Part 50 App L Sec 9.1 and 9.2
Field Thermometer | 1/yr | ±0.1 °C resolution; ±0.5 °C accuracy | 1, 2 and 3) Method 2.12 Sec 4.2.2 and Table 3-1
Field Barometer | 1/yr | ±1 mm Hg resolution; ±5 mm Hg accuracy | 1, 2 and 3) Method 2.12 Sec 4.2.2 and Table 3-1
Clock/Timer Verification | 1/mo | 1 min/mo | 1 and 2) Method 2.12 Table 3-1; 3) 40 CFR Part 50 App L Sec 7.4.12

Laboratory Activities
Microbalance Readability | at purchase | 1 µg | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.1
Microbalance Repeatability | 1/yr | 1 µg | 1) Method 2.12 Sec 4.3.6; 2) Recommendation; 3) Method 2.12 Sec 4.3.6
Primary Mass Verification/Calibration Standards Recertifications | 1/yr | 0.025 mg | 1, 2 and 3) Method 2.12 Sec 4.3.7 and Table 3-2

Comment #1: It is stated in the CFR that the criterion is < 80 mL/min. Exactly what samplers use this unit of measure? Most, if not all, samplers use either the in Hg or the mm Hg unit. How can a flow unit of measure be converted to a pressure unit of measure? Is there any way to change or add more applicable units to ease the confusion? The following is in the PM2.5 PEP SOP: to pass the test, the actively displayed differential system pressure (shown on the right side of the screen as SP) must not drop by more than 5 cm of water during the 2-minute time interval (or 10 cm of water if using a 10-minute time interval). This is equivalent to the 80 mL/min acceptance criterion stated in related QA documents.

1/ = value must be flagged; SD* = standard deviation; CV = coefficient of variation
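The two critical flow-rate criteria in the table above (24-hour average within 5% of 16.67 L/min and flow CV < 2%) reduce to simple statistics over the logged flow readings. A minimal sketch, with invented readings:

```python
# Sketch of the two critical flow-rate checks for a filter-based PM2.5
# sampler (40 CFR Part 50 App L Sec 7.4.3); flow readings are hypothetical.
import statistics

DESIGN_FLOW = 16.67  # L/min

def flow_checks(flows_lpm):
    mean = statistics.mean(flows_lpm)
    cv = 100.0 * statistics.stdev(flows_lpm) / mean     # percent CV
    avg_ok = abs(mean - DESIGN_FLOW) / DESIGN_FLOW <= 0.05
    return mean, cv, avg_ok and cv <= 2.0

flows = [16.6, 16.7, 16.8, 16.7, 16.6, 16.7]            # logged readings
mean, cv, ok = flow_checks(flows)
print(f"mean={mean:.2f} L/min, CV={cv:.2f}% -> {'pass' if ok else 'fail'}")
```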
Continuous PM2.5 Local Conditions Validation Template

NOTE: A number of continuous monitors may be designated as an FEM or an ARM. These monitors may have different measurement or sampling attributes that cannot be identified in this validation template. Monitoring organizations should review specific instrument operating manuals to augment this validation template as necessary. In general, 40 CFR Part 58 App A and 40 CFR Part 50 App L requirements apply to continuous PM2.5.

1) Criteria (PM2.5 Cont) | 2) Frequency | 3) Acceptable Range | Information/Action

CRITICAL CRITERIA - PM2.5 Continuous, Local Conditions
Sampling Period: 24-hour estimate | every sample period | > 75% (18) of hourly averages | 1, 2 and 3) 40 CFR Part 50 App N Sec 3(c). See additional details for sample periods with fewer than 18 hours
Sampling Period: Hourly estimates | every hour | instrument dependent | See operator's manual
Average Flow Rate | every 24 hours of operation | average within 5% of 16.67 liters/minute | 1, 2 and 3) 40 CFR Part 50 App L Sec 7.4.3.1
Variability in Flow Rate | every 24 hours of operation | CV < 2% | 1, 2 and 3) 40 CFR Part 50 App L Sec 7.4.3.2
One-point Flow Rate Verification | 1/mo | ±4% of transfer standard; ±5% of flow rate design value | 1, 2 and 3) 40 CFR Part 50 App L Sec 9.2.5, and 40 CFR Part 58 App A Sec 3.2.3 and 3.3.2
BAM-specific: Reference Membrane Span Foil Verification (BAM) | hourly | ±4% of ABS value | 1, 2 and 3) BAM 1020 Operation Manual

OPERATIONAL CRITERIA - PM2.5 Continuous, Local Conditions
Leak Check | every 30 days | < 1.0 lpm BAM (not Thermo BAMs); ±0.15 lpm TEOM | 1) 40 CFR Part 50 App L Sec 7.4.6.1; 2) Recommendation; 3) BAM SOP Sec 10.1.2, TEOM SOP Sec 10.1.6. A leak check should not be attempted on a Thermo BAM; foils could be ruptured
Temperature Multi-point Verification/Calibration | on installation, then 1/yr | ±2 °C | 1) 40 CFR Part 50 App L Sec 9.3; 2 and 3) Method 2.12 Sec 6.4
One-point Temp Verification | 1/mo | ±2 °C | 1) 40 CFR Part 50 App L Sec 9.3; 2) Method 2.12 Table 6-1; 3) Recommendation
Pressure Verification/Calibration | on installation, then 1/yr | ±10 mm Hg | 1) 40 CFR Part 50 App L Sec 9.3; 2 and 3) Method 2.12 Sec 6.5. BP verified against an independent standard, which is verified against a lab primary standard certified NIST traceable 1/year
Flow Rate Multi-point Verification/Calibration | electromechanical maintenance, or transport, or 1/yr | ±4% of transfer standard | 1) 40 CFR Part 50 App L Sec 9.2; 2) 40 CFR Part 50 App L Sec 9.1.3 and Method 2.12 Table 6-1; 3) 40 CFR Part 50 App L Sec 9.2.5
Design Flow Rate Adjustment | at one-point or multi-point verification/calibration | ±2% of design flow rate | 1, 2 and 3) 40 CFR Part 50 App L Sec 9.2.6
Other Monitor Calibrations | per manufacturer's operating manual | per manufacturer's operating manual |
Precision: Collocated Samples | every 12 days for 15% of sites by method designation | CV < 10% for samples > 3 µg/m³ | 1 and 2) 40 CFR Part 58 App A Sec 3.2.5; 3) Recommendation based on DQO in 40 CFR Part 58 App A Sec 2.3.1.3
Accuracy: Temperature Audit | 1/yr | ±2 °C | 1, 2 and 3) Method 2.12 Sec 10.2.2 and Table 3-1
Accuracy: Pressure Audit | 1/yr | ±10 mm Hg | 1, 2 and 3) Method 2.12 Sec 10.2.3 and Table 3-1
Accuracy: Semi-Annual Flow Rate Audit | 1/6 mo | ±4% of audit standard; ±5% of design flow rate | 1 and 2) 40 CFR Part 58 App A Sec 3.3.3; 3) Method 2.12 Sec 10.2.1 and Table 10-1
Shelter Temperature Range | Daily (hourly values) | 20 to 30 °C (hourly avg), or per manufacturer's specifications if designated to a wider temperature range | Generally the 20-30 °C range will apply, but the most restrictive operable range of the instruments in the shelter may also be used as guidance
Shelter Temperature Control | Daily (hourly values) | < ±2 °C SD over 24 hours | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Shelter Temperature Device Check | 1/6 mo | ±2 °C | 1, 2 and 3) QA Handbook Volume 2 Section 7.2.2
Monitor Maintenance: Virtual Impactor (VSCC) | every 30 days | cleaned/changed | 1, 2 and 3) Recommendation
Monitor Maintenance: Inlet Cleaning | every 30 days | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3
Monitor Maintenance: Filter Chamber Cleaning | every 30 days | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3
Monitor Maintenance: Circulating Fan Filter Cleaning | 1/mo | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 9.3
Monitor Maintenance: Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP |

TEOM-Specific Operational Criteria
Total Flow Verification | every 30 days | sum of flow rates from the 3 paths equals design flow rate ±5% | 1, 2 and 3) TEOM SOP Sec 10.1.2
Bypass Leak Check (TEOM) | every 30 days | ±0.60 lpm | 1, 2 and 3) TEOM SOP Sec 10.1.6, or TEOM Operating Manual Sec 5-4
Replace TEOM Filters | every 30 days | as filter loading approaches 100% | 1, 2 and 3) TEOM SOP Sec 10.1.8
Replace the 47-mm FDMS (Purge) Filters | every 30 days, or any time TEOM filters are replaced | replaced | 1, 2 and 3) TEOM SOP Sec 10.1.10
Internal/External Data Logger Data | every 30 days | 10 randomly selected values agree exactly (digital) and within ±1 µg/m³ (analog) | 1, 2 and 3) TEOM SOP Sec 10.1.24
Replace In-line Filters | 1/6 mo | replaced | 1, 2 and 3) TEOM SOP Sec 10.2
Clean Cooler Assembly | 1/yr | cleaned | 1, 2 and 3) TEOM SOP Sec 10.3.1
Clean/Maintain Switching Valve | 1/yr | cleaned | 1, 2 and 3) TEOM SOP Sec 10.3.2
Clean Air Inlet System of Mass Transducer Enclosure | 1/yr | cleaned | 1, 2 and 3) TEOM SOP Sec 10.3.3
Replace the Dryers | 1/yr, or due to poor performance | replaced | 1, 2 and 3) TEOM SOP Sec 10.3.4
Calibration (K0) Constant Verification | 1/yr | pass or fail (< 2.5%) | 1 and 2) TEOM SOP Sec 10.3.6; 3) 1405-DF operating guide. The verification software either passes or fails the verification; the acceptance criterion is < 2.5%
Rebuild Sampling Pump | 18 months | < 66% of local pressure | 1, 2 and 3) TEOM SOP Sec 10.4

GRIMM-Specific Operational Criteria
Internal Rinsing Air Filter | after a few years | changed | 1, 2 and 3) GRIMM SOP Sec 12.4. May require trained service staff to change; may only require changing if a "check nozzle and air inlet" message appears
Change Dust Filter | 1/year | changed | 1, 2 and 3) GRIMM SOP Sec 12.3

BAM-Specific Operational Criteria
Cleaning Nozzle and Vane (BAM) | every 30 days | cleaned | 1, 2 and 3) BAM SOP Sec 10.1.3
Replace or Clean Pump Muffler | 1/6 mo | cleaned or changed |
Internal/External Data Logger Data (BAM) | every 30 days | 10 randomly selected values agree exactly (digital) and within ±1 µg/m³ (analog) | 1, 2 and 3) BAM SOP Sec 10.1.9
Capstan Shaft and Pinch Roller Cleaning (BAM) | every 30 days | cleaned | 1, 2 and 3) BAM SOP Sec 10.1.3
Smart Heater Test | 1/6 mo | heater turns on when forced | 1, 2 and 3) BAM SOP Sec 10.3.3
Clean/Replace Internal Debris Filter | 1/year | cleaned/replaced |
72-Hour Zero Filter Test | at installation and 1/year | | 1, 2 and 3) BAM SOP Sec 9.6.10
Check of Membrane Span Foil | 1/year | avg < ±5% of ABS | 1, 2 and 3) BAM SOP Sec 10.4.3
Beta Detector Count Rate | 1/year | between 600,000 and 1,000,000 | 1, 2 and 3) BAM SOP Sec 10.4.4

SYSTEMATIC CRITERIA - PM2.5 Continuous, Local Conditions
Sampler/Monitor | NA | Meets requirements listed in FRM/FEM/ARM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 and FRM/FEM method list
Siting | 1/year | Meets siting criteria, or waiver documented | 1) 40 CFR Part 58 App E Sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E Sections 2-5
Data Completeness | quarterly | > 75% | 40 CFR Part 50 App N Sec 4.1(b), 4.2(a)
Reporting Units | all filters | µg/m³ at ambient temp/pressure (PM2.5) | 1, 2 and 3) 40 CFR Part 50 App N Sec 3.0(b)
Rounding Convention for Data Reported to AQS | all filters | to one decimal place, with additional digits to the right truncated | 1, 2 and 3) 40 CFR Part 50 App N Sec 3.0(b)
Annual 3-yr Average | all concentrations | nearest 0.1 µg/m³ (> 0.05 rounds up) | 1, 2 and 3) 40 CFR Part 50 App N Sec 3 and 4. The rounding convention for data reported to AQS is a recommendation
24-hour, 3-year Average | all concentrations | nearest 1 µg/m³ (> 0.5 rounds up) | 1, 2 and 3) 40 CFR Part 50 App N Sec 3 and 4. The rounding convention for data reported to AQS is a recommendation
Detection Limit: Lower DL | all filters | 2 µg/m³ | 1, 2 and 3) 40 CFR Part 50 App L Sec 3.1
Detection Limit: Upper Conc. Limit | all filters | 200 µg/m³ | 1, 2 and 3) 40 CFR Part 50 App L Sec 3.2

Verification/Calibration Standards Recertifications (all standards should have multi-point certifications against NIST-traceable standards)
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable standard | 40 CFR Part 50 App L Sec 9.1 and 9.2
Field Thermometer | 1/yr | ±0.1 °C resolution; ±0.5 °C accuracy | Method 2.12 Sec 4.2.2
Field Barometer | 1/yr | ±1 mm Hg resolution; ±5 mm Hg accuracy | Method 2.12 Sec 4.2.2
Calibration and Check Standards: Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable standard | 40 CFR Part 50 App L Sec 9.1 and 9.2
Clock/Timer Verification | 1/mo | 1 min/mo ** | 40 CFR Part 50 App L Sec 7.4
Precision: Single Analyzer (collocated monitors) | 1/3 mo | coefficient of variation (CV) < 10% for values > 3 µg/m³ | 1, 2 and 3) Recommendation, to provide early (quarterly) evaluation of achievement of DQOs
Precision: Primary Quality Assurance Org. | annual and 3-year estimates | 90% CL of CV < 10% for values > 3 µg/m³ | 1, 2 and 3) 40 CFR Part 58 App A Sec 4.3.1 and 2.3.1.1
Bias: Performance Evaluation Program (PEP) | 5 audits for PQAOs with ≤ 5 sites; 8 audits for PQAOs with > 5 sites | ±10% for values > 3 µg/m³ | 1, 2 and 3) 40 CFR Part 58 App A Sec 3.2.7, 4.3.2 and 2.3.1.1

1/ = value must be flagged; due to the current implementation of BAM (sampling 42 minutes/hour), there are only 1008 minutes of sampling in a 24-hour period
SD = standard deviation; CV = coefficient of variation
** = need to ensure the data system stamps the appropriate time period on the reported sample value
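The reporting rows above (and the ozone "3 places after decimal" row earlier) use truncation rather than rounding: digits to the right of the reported precision are simply dropped. A small sketch of that convention:

```python
# Sketch of the AQS truncation convention: report to a fixed number of
# decimal places with additional digits to the right dropped, not rounded.
import math

def truncate(value, places):
    factor = 10 ** places
    return math.floor(value * factor) / factor

print(truncate(35.2789, 1))   # 35.2  (not rounded up to 35.3)
print(truncate(0.08653, 3))   # 0.086 (ozone ppm convention: 3 places)
```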
PM10c for PM10-2.5 Low Volume, Filter-Based Local Conditions Validation Template

NOTE: The following validation template was constructed for use of PM10 at local conditions, where PM10c is used in the calculation of the PM10-2.5 measurement or for objectives other than comparison to the PM10 NAAQS. Although the PM10-2.5 method is found in 40 CFR Part 50 Appendix O, Appendix O references Appendix L (the PM2.5 method) for the QC requirements listed below. Therefore, the Information/Action column, in most cases, will reference 40 CFR Part 50 App L. Monitoring organizations using PM10 data for NAAQS comparison purposes should refer to the PM10 validation template for STP (standard temperature and pressure correction). In addition, since the samplers are very similar to the PM2.5 samplers, Guidance Document 2.12, Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent Methods, is referred to where appropriate.
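As context for the note above: under 40 CFR Part 50 Appendix O, PM10-2.5 is a difference measurement, obtained by subtracting a collocated PM2.5 concentration from the PM10c concentration, both at local conditions. A trivial worked example (the concentrations are invented):

```python
# PM10-2.5 as a difference measurement (40 CFR Part 50 App O):
# coarse fraction = PM10c minus collocated PM2.5, both in ug/m3
# at local conditions. Values below are hypothetical.

def pm10_minus_25(pm10c_ugm3, pm25_ugm3):
    return pm10c_ugm3 - pm25_ugm3

print(pm10_minus_25(42.3, 17.8))   # 24.5 ug/m3 coarse-fraction estimate
```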
1) Criteria (PM10c ) 2) Frequency 3) Acceptable Range Information /Action
CRITICAL CRITERIA- PM10c Filter Based Local Conditions
Field Activities
Filter Holding Times
Sample Recovery all filters <7 days 9 hours from sample end date 1, 2 and 3) 40 CFR Part 50 App L Sec 10.10
Sampling Period (including
multiple power failures)
all filters
1380-1500 minutes, or
value if < 1380 and exceedance of NAAQS
1/
midnight to midnight local standard time
1, 2 and 3) 40 CFR Part 50 App L Sec 3.3
See details if less than 1380 min sampled
Sampling Instrument
Average Flow Rate every 24 hours of op average within 5% of 16.67 liters/minute
1, 2 and 3) Part 50 App L Sec 7.4.3.1
Variability in Flow Rate every 24 hours of op CV < 2% 1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4.3.2
One-point Flow Rate Verification 1/mo
+ 4% of transfer standard
+5% of flow rate design value
1, 2 and 3) 40 CFR Part 50, App.L, Sec 9.2.5, 40 CFR
Part 58, Appendix A Sec 3.2.3 & 3.3.2
Laboratory Activities
Post-sampling Weighing all filters
<10 days from sample end date if shipped at
ambient temp, or
<30 days if shipped below avg ambient (or 4
o
C
or below for avg sampling temps < 4
o
C ) from
sample end date
1, 2 and 3) 40 CFR Part 50 App L Sec 8..3.6
Filter Visual Defect Check
(unexposed)
all filters
Correct type & size and for pinholes, particles or
imperfections
1, 2 and 3) 40 CFR Part 50, App.L Sec 10.2
Filter Conditioning Environment
Equilibration all filters 24 hours minimum 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.5
Temp. Range all filters 24-hr mean 20-23
o
C 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.1
Temp.Control all filters + 2
o
C SD* over 24 hr 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.2
Humidity Range
all filters
24-hr mean 30% - 40% RH or
<5% sampling RH but > 20%RH
1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.3
Humidity Control all filters + 5% SD* over 24 hr. 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.4
Pre/post Sampling RH all filters difference in 24-hr means < + 5% RH 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.3.3
QA Handbook Volume II, Appendix D
Revision No. 0
Date:05/13
Page 25 of 48
1) Criteria (PM10c ) 2) Frequency 3) Acceptable Range Information /Action
Balance all filters located in filter conditioning environment 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.3.2
OPERATIONAL EVALUATIONS TABLE- PM10c Filter Based Local Conditions
Field Activities
Sampling Instrument
Individual Flow Rates every 24 hours of op no flow rate excursions > +5% for > 5 min.
1/
1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4.3.1
Filter Temp Sensor every 24 hours of op
no excursions of > 5
o
C lasting longer than 30 min
1/
1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4.11.4
Routine Verifications
External Leak Check every 5 sampling events < 80 mL/min (see comment #1)
1) 40 CFR Part 50 App L, Sec 7.4.6.1
2) Method 2-12 Table 8-1
3) 40 CFR Part 50, App.L, Sec 7.4.6.1
Internal Leak Check every 5 sampling events < 80 mL/min
1) 40 CFR Part 50, App.L, Sec 7.4.6.2
2) Method 2-12 Table 8-1
3) 40 CFR Part 50, App.L, Sec 7.4.6.2
One-point Temp Verification 1/mo +2
o
C
1) 40 CFR Part 50, App.L, Sec 9.3
2) Method 2.12 Table 6-1
3) Recommendation
Pressure Verification 1/mo +10 mm Hg
1) 40 CFR Part 50, App.L, Sec 9.3
2) Method 2.12 Table 6-1
3) Recommendation
Annual Multi-point Verifications/Calibrations
Temperature multi-point
Verification/Calibration
on installation, then 1/yr +2
o
C
1) 40 CFR Part 50, App.L, Sec 9.3
2 and 3) Method 2.12 sec 6.4
Pressure Verification/Calibration on installation, then 1/yr +10 mm Hg
1) 40 CFR Part 50, App.L, Sec 9.3
2 and 3) Method 2.12 sec 6.5
Sampler BP verified against independent standard
verified against a lab primary standard that is certified
as NIST traceable 1/year
Flow Rate Multi-point
Verification/ Calibration
Electromechanical
maintenance or transport or
1/yr
+ 4% of transfer standard
1) 40 CFR Part 50, App.L, Sec 9.2.
2) 40 CFR Part 50, App.L, Sec 9.1.3, Method 2.12
Table 6-1
3) 40 CFR Part 50, App.L, Sec 9.2.5
Design Flow Rate Adjustment
at one-point or multi-point
verification/calibration
+ 2% of design flow rate
1,2 and 3) 40 CFR Part 50, App.L, Sec 9.2.6
Other Monitor Calibrations per manufacturers op manual per manufacturers operating manual 1,2 and 3) Recommendation
Precision
Collocated Samples
every 12 days for 15% of sites
by method designation
CV <15% of samples >3 g/m
3
1) and 2) Part 58 App A Sec 3.2.5
3 Recommendation based on DQO in 40 CFR Part 58
App A Sec 2.3.1.3
Accuracy
Temperature Audit 1/yr +2
o
C 1, 2 and 3) Method 2.12 Sec. 10.2.2 & Table 3-1
QA Handbook Volume II, Appendix D
Revision No. 0
Date:05/13
Page 26 of 48
1) Criteria (PM10c ) 2) Frequency 3) Acceptable Range Information /Action
Pressure Audit | 1/yr | ±10 mm Hg | 1, 2 and 3) Method 2.12 Sec. 10.2.3 & Table 3-1
Semi Annual Flow Rate Audit | 1/6 mo | ±4% of audit standard; ±5% of design flow rate | 1 and 2) Part 58 App A, Sec 3.3.3; 3) Method 2.12 Sec. 10.2.1 & Table 10-1
Monitor Maintenance
Impactor (WINS) | every 5 sampling events | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 8.3.1
Very Sharp Cut Cyclone | every 30 days | cleaned/changed | 1, 2 and 3) Recommendation
Inlet/downtube Cleaning | every 15 sampling events | cleaned | 1, 2 and 3) Method 2.12 Sec 9.4.1
Filter Chamber Cleaning | 1/mo | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3
Circulating Fan Filter Cleaning | 1/mo | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 9.3
Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP |
Laboratory Activities
Filter Checks
Lot Blanks | 9 filters per lot | less than 15 µg change between weighings | 1, 2 and 3) Recommendation; used to determine filter stability of the lot of filters received from EPA or vendor
Exposure Lot Blanks | 3 filters per lot | less than 15 µg change between weighings | 1, 2 and 3) Method 2.12 Sec. 7.7; used for preparing a subset of filters for equilibration
Filter Integrity (exposed) | each filter | no visual defects | 1, 2 and 3) Method 2.12 Sec. 7.10
Filter Holding Times
Pre-sampling | all filters | <30 days before sampling | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.3.5
Lab QC Checks
Field Filter Blank | 10% or 1 per weighing session | ±30 µg change between weighings | 1) 40 CFR Part 50, App.L Sec 8.3.7.1; 2 and 3) Method 2.12 Sec. 7.7
Lab Filter Blank | 10% or 1 per weighing session | ±15 µg change between weighings | 1) 40 CFR Part 50, App.L Sec 8.3.7.2; 2 and 3) Method 2.12 Sec. 7.7
Balance Check (working standards) | beginning, 10th sample, end | <3 µg | 1, 2 and 3) Method 2.12 Sec. 7.9
Duplicate Filter Weighing | 1 per weighing session | ±15 µg change between weighings | 1, 2 and 3) Method 2.12 Sec 7.11
Microbalance Audit | 1/yr | ±0.050 mg or manufacturer's specs, whichever is tighter | 1, 2 and 3) Method 2.12 Sec. 10.2.6
Verification/Calibration
Lab Temperature | 1/6 months | ±2 °C | 1) Method 2.12 Table 3-2; 2) Recommendation (Table 3-2 suggests every 3 mo.); 3) Method 2.12 Table 3-2
Lab Humidity | 1/6 months | ±2% | 1) Method 2.12 Table 3-2; 2) Recommendation (Table 3-2 suggests every 3 mo.); 3) Method 2.12 Table 3-2
Microbalance Calibration | at installation and prior to each weighing session, 1/yr | manufacturer's specification | 1) 40 CFR Part 50, App.L, Sec 8.1; 2) 40 CFR Part 50, App.L, Sec 8.1 and Method 2.12 Sec. 7.2; 3) NA
Calibration & Check Standards - Working Mass Stds. (compare to primary standards) | 1/3 mo. | ±0.025 mg | 1, 2 and 3) Method 2.12 Sec 4.3 and 7.3
SYSTEMATIC CRITERIA - PM10c Filter Based Local Conditions
Sampler/Monitor | NA | meets requirements listed in FRM/FEM/ARM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 & FRM/FEM method list
Siting | 1/year | meets siting criteria or waiver documented | 1) 40 CFR Part 58 App E, sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E, sections 2-5
Data Completeness | NA | >75% scheduled sampling days in each quarter | 1, 2 and 3) Recommendation based on PM2.5 requirements in 40 CFR Part 50, App. N, Sec. 4.1(b) and 4.2(a)
Reporting Units | all filters | µg/m³ at ambient temp/pressure (PM2.5) | 1, 2 and 3) 40 CFR Part 50 App N
Rounding convention for data reported to AQS | all concentrations | nearest 0.1 µg/m³ (>0.05 round up) | 1, 2 and 3) Recommendation based on PM2.5 requirements in 40 CFR Part 50 App N sect 4.3
Detection Limit
Lower DL | all filters | <3 µg/m³ | 1, 2 and 3) 40 CFR Part 50, App O Sec 3.1
Upper Conc. Limit | all filters | >200 µg/m³ | 1, 2 and 3) 40 CFR Part 50, App.O Sec 3.2
Precision
Single analyzer (collocated monitors) | 1/3 mo. | coefficient of variation (CV) <10% for values >3 µg/m³ | 1, 2 and 3) Recommendation, in order to provide early evaluation of achievement of DQOs
Primary Quality Assurance Org. | annual and 3-year estimates | 90% CL of CV <10% for values >3 µg/m³ | 1, 2 and 3) 40 CFR Part 58, App A Sec 4.3.1 and 2.3.1.1
Bias
Performance Evaluation Program (PEP) | 5 audits for PQAOs with ≤5 sites; 8 audits for PQAOs with >5 sites | ±10% for values >3 µg/m³ | 1, 2 and 3) 40 CFR Part 58, App A, Sec 3.2.7, 4.3.2 and 2.3.1.1
Field Activities
Verification/Calibration Standards Recertifications - All standards should have multi-point certifications against NIST-traceable standards
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable Std. | 1) 40 CFR Part 50, App.L Sec 9.1 & 9.2; 2) Method 2.12 Section 6.3.3 and Table 3-1; 3) 40 CFR Part 50, App.L Sec 9.1 & 9.2
Field Thermometer | 1/yr | ±0.1 °C resolution, ±0.5 °C accuracy | 1, 2 and 3) Method 2.12 Sec 4.2.2 & Table 3-1
Field Barometer | 1/yr | ±1 mm Hg resolution, ±5 mm Hg accuracy | 1, 2 and 3) Method 2.12 Sec 4.2.2 & Table 3-1
Verification/Calibration
Clock/timer Verification | 1/mo | 1 min/mo | 1 and 2) Method 2.12 Table 3-1; 3) 40 CFR Part 50, App.L, Sec 7.4.12
Laboratory Activities
Microbalance Readability | at purchase | 1 µg | 1, 2 and 3) 40 CFR Part 50, App.L, Sec 8.1
Microbalance Repeatability | 1/yr | 1 µg | 1) Method 2.12 Sec 4.3.6; 2) Recommendation; 3) Method 2.12 Sec 4.3.6
Primary Mass Stds. | 1/yr | ±0.025 mg | 1, 2 and 3) Method 2.12 Sec 4.3.7 & Table 3-2
Comment #1
It is stated in the CFR that the criterion is <80 mL/min. Exactly what samplers use this unit of measure? Most, if not all, samplers use either the in Hg or the mm Hg unit. How can a volumetric flow unit be converted to a pressure unit? Is there any way to change or add more applicable units to ease the confusion? The following is in the PM2.5 PEP SOP: "To pass the test, the actively displayed differential system pressure (shown on the right side of the screen as SP) must not drop by more than 5 cm of water during the 2-minute time interval (or 10 cm of water if using a 10-minute time interval)." This is equivalent to the 80 mL/min acceptance criterion stated in related QA documents.
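The equivalence questioned above follows from the ideal gas law: in a sealed system of fixed internal volume, the leak flow implied by a pressure decay is approximately the volume multiplied by the fractional pressure change per unit time. The sketch below is illustrative only; the internal volume is an assumed placeholder, not a sampler specification, and the exact flow/pressure equivalence depends on the individual sampler's internal plumbing.

```python
# Hypothetical illustration of why a leak *flow* criterion (mL/min) can be
# restated as a pressure-drop criterion: for a sealed system of fixed internal
# volume V, the ideal gas law gives leak flow Q ~ V * (dP / P_ambient) / dt.
# The internal volume used here is an assumed placeholder, not a sampler spec.

P_AMBIENT_CM_H2O = 1033.0  # ~1 atm expressed as a water-column height (cm)

def leak_flow_ml_per_min(internal_volume_ml: float,
                         pressure_drop_cm_h2o: float,
                         interval_min: float) -> float:
    """Approximate leak flow implied by a pressure drop in a sealed volume."""
    fraction_lost = pressure_drop_cm_h2o / P_AMBIENT_CM_H2O
    return internal_volume_ml * fraction_lost / interval_min

# Example using the PEP SOP limits and an assumed 1 L internal volume:
print(leak_flow_ml_per_min(1000.0, 5.0, 2.0))    # 5 cm H2O over 2 minutes
print(leak_flow_ml_per_min(1000.0, 10.0, 10.0))  # 10 cm H2O over 10 minutes
```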
1/ Value must be flagged. SD = standard deviation; CV = coefficient of variation.
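For the collocated-precision criteria in this template, 40 CFR Part 58 App A sec 4.3.1 defines a CV statistic and a 90% upper confidence limit on it. The sketch below (requires SciPy) is a common reading of that calculation, not the regulatory text itself; the exact degrees of freedom and cutoff handling should be verified against the current App A before any regulatory use.

```python
# A minimal sketch of the collocated-precision statistic outlined in
# 40 CFR Part 58 App A sec 4.3.1: percent relative differences of paired
# primary/collocated values above the 3 ug/m3 cutoff, a CV estimate, and a
# 90% upper confidence limit obtained from a chi-square quantile.
import math
from scipy.stats import chi2

def collocated_cv(pairs, cutoff=3.0):
    """pairs: (primary, collocated) 24-hr concentrations in ug/m3."""
    d = [100.0 * (x - y) / ((x + y) / 2.0)
         for x, y in pairs if x >= cutoff and y >= cutoff]
    n = len(d)
    cv = math.sqrt(sum(di ** 2 for di in d) / (2.0 * n))
    # 90% upper confidence limit on the CV (chi-square lower tail at 0.1)
    cv_90cl = cv * math.sqrt(n / chi2.ppf(0.1, n))
    return cv, cv_90cl

cv, cv_90cl = collocated_cv([(12.1, 11.4), (8.3, 8.9), (25.0, 23.6)])
print(f"CV = {cv:.1f}%, 90% CL = {cv_90cl:.1f}% (acceptance: <10%)")
```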
PM10 Filter Based Dichot STP Conditions Validation Template
1) Criteria (PM10 Dichot STP) | 2) Frequency | 3) Acceptable Range | Information/Action
CRITICAL CRITERIA - PM10 Filter Based Dichot
Field Activities
Filter Holding Times
Sample Recovery | all filters | ASAP | 1, 2 and 3) 40 CFR Part 50 App J sec 9.15
Sampling Period | all filters | 1440 minutes ±60 minutes, midnight to midnight local standard time | 1, 2 and 3) 40 CFR Part 50 App J sec 7.1.5
Sampling Instrument
Average Flow Rate | every 24 hours of op | average 16.67 liters/minute | 1, 2 and 3) Method 2.10 sec 2.1
Verification/Calibration
One-point Flow Rate Verification | 1/mo | ±7% of transfer standard | 1, 2 and 3) Method 2.10 Table 3-1
Lab Activities
Filter
Visual Defect Check (unexposed) | all filters | see reference | 1, 2 and 3) Method 2.10 sec 4.2
Collection Efficiency | all filters | >99% | 1, 2 and 3) Part 50, App J sec 7.2.2
Alkalinity | all filters | <25.0 microequivalents/gram | 1, 2 and 3) 40 CFR Part 50, App J sec 7.2.4
Filter Conditioning Environment
Equilibration | all filters | 24 hours minimum | 1, 2 and 3) 40 CFR Part 50, App.J sec 9.3
Temp. Range | all filters | 15-30 °C | 1, 2 and 3) 40 CFR Part 50, App.J sec 7.4.1
Temp. Control | all filters | ±3 °C SD* over 24 hr | 1, 2 and 3) 40 CFR Part 50, App.J sec 7.4.2; SD statistic is a recommendation
Humidity Range | all filters | 20%-45% RH | 1, 2 and 3) 40 CFR Part 50, App.J sec 7.4.3
Humidity Control | all filters | ±5% SD* over 24 hr | 1, 2 and 3) 40 CFR Part 50, App.J sec 7.4.4; SD use is a recommendation
Pre/post Sampling RH | all filters | difference in 24-hr means <±5% RH | 1, 2 and 3) Recommendation based on 40 CFR Part 50, App.L sec 8.3.3
Balance | all filters | located in filter conditioning environment | 1, 2 and 3) Recommendation based on 40 CFR Part 50, App.L sec 8.3.2
OPERATIONAL EVALUATIONS TABLE - PM10 Filter Based Dichot
Field Activities
Verification/Calibration
System Leak Check | during precalibration check | vacuum of 10 to 15 in. and rate of decline to 0 in >60 seconds | 1, 2 and 3) Method 2.10 sec 2.2.1
FR Multi-point Verification/Calibration | 1/yr | correlation coefficient >0.990, with no point deviating more than 0.5 L/min for total or 0.05 L/min for coarse | 1) 40 CFR Part 50, App.J, sec 8.0; 2 and 3) Method 2.10 Sec 2.2.4
Field Temp M-point Verification | on installation, then 1/yr | ±2 °C | 1, 2 and 3) Recommendation based on Part 50, App.L
Precision
Collocated Samples | every 12 days for 15% of sites | 5 µg/m³ for concentrations below 80 µg/m³ and 7% for concentrations above 80 µg/m³ | 1 and 2) 40 CFR Part 58 App A sec 3.3.1; 3) Part 50, App J sec 4.1
Semi Annual Flow Rate Audit | 1/6 mo | ±10% of audit standard | 1 and 2) 40 CFR Part 58, App A, sec 3.3.3; 3) Method 2.10 Sec 7.1.5
Monitor Maintenance
Impactor | 1/3 mo | cleaned/changed | 1, 2 and 3) Method 2.10 sec 6.1.2
Inlet/downtube Cleaning | 1/3 mo | cleaned | 1, 2 and 3) Method 2.10 sec 6.1.2
Vacuum pump | 1/yr | replace diaphragm and flapper valves | 1, 2 and 3) Method 2.10 sec 6.1.3
Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP |
Lab Activities
Balance Check | beginning, 10th sample, end | ±4 µg of true zero; <2 µg of 10 mg check weight | 1, 2 and 3) Method 2.10 sec 4.5
Standard filter QC check | 10% | ±20 µg change from original value | 1, 2 and 3) Method 2.10 sec 4.5; from standard non-routine filter
Routine duplicate weighing | 5-7 per weighing session | ±20 µg change from original value | 1, 2 and 3) Method 2.10 sec 4.5; from routine filter set
Integrity - random sample of test field blank filters | 10% | ±5 µg/m³ | 1) 40 CFR Part 50 App J sec 7.2.3; 2) Recommendation; 3) 40 CFR Part 50 App J sec 7.2.3
Lab Temperature Calibration | 1/6 months | ±2 °C | 1, 2 and 3) Recommendation related to 40 CFR Part 50, App.L
Lab Humidity Calibration | 1/6 months | ±2% | 1, 2 and 3) Recommendation related to 40 CFR Part 50 App L sec 5.8.1
Microbalance Calibration | 1/yr | manufacturer's specification | 1, 2 and 3) Recommendation related to 40 CFR Part 50 App L
Filter Weighing Audit | 1/yr | ±20 µg change from original value | 1, 2 and 3) Method 2.10 Table 7-1
Balance Audit | 1/yr | observe weighing technique and check balance with ASTM Class 1 standard | 1, 2 and 3) Method 2.10 Table 7-1, section 7.2.2
SYSTEMATIC CRITERIA - PM10 Filter Based Dichot
Sampler/Monitor | NA | meets requirements listed in FRM/FEM/ARM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 & FRM/FEM method list
Siting | 1/year | meets siting criteria or waiver documented | 1) 40 CFR Part 58 App E, sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E, sections 2-5
Data Completeness | 24-hour standard | >75% scheduled sampling days in each quarter | 1, 2 and 3) 40 CFR Part 50 App. K, sec. 2.3b
Reporting Units | all filters | µg/m³ at standard temperature and pressure | 1, 2 and 3) 40 CFR Part 50 App K
Rounding convention for data reported to AQS | each routine concentration | nearest 10 µg/m³ (>5 round up) | 1, 2 and 3) 40 CFR Part 50 App K sec 2
Precision
Single analyzer | 1/3 mo. | coefficient of variation (CV) <10% for values >3 µg/m³ | 1, 2 and 3) Recommendation; 3 µg/m³ cutoff in 40 CFR Part 58 App A sec 4
Single analyzer | 1/yr | CV <10% for values >3 µg/m³ | 1, 2 and 3) Recommendation; 3 µg/m³ cutoff in 40 CFR Part 58 App A sec 4
Primary Quality Assurance Org. | annual and 3-year estimates | 90% CL of CV <10% for values >3 µg/m³ | 1, 2 and 3) Recommendation; 3 µg/m³ cutoff in 40 CFR Part 58 App A sec 4
Field Activities
Verification/Calibration Standards and Recertifications - All standards should have multi-point certifications against NIST-traceable standards
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable Std. | 1, 2 and 3) 40 CFR Part 50 App J sec 7.3
Field Thermometer | 1/yr | ±0.1 °C resolution, ±0.1 °C accuracy | 1, 2 and 3) Method 2.10 section 1.1.2
Field Barometer | 1/yr | ±1 mm Hg resolution, ±5 mm Hg accuracy | 1, 2 and 3) Method 2.10 section 1.1.2
Clock/timer Verification | 1/6 mo | 15 min/day | 1, 2 and 3) Method 2.10 sec 9
Lab Activities
Microbalance | at purchase | readability 1 µg, repeatability 1 µg | 1, 2 and 3) Method 2.10 sec 4.4
Primary Mass Stds. (compare to NIST-traceable standards) | 1/yr | NIST traceable (e.g., ANSI/ASTM Class 1, 1.1 or 2) | 1, 2 and 3) Method 2.10 sec 9
* SD = standard deviation; CV = coefficient of variation
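The dichot collocated-precision criterion above switches from an absolute limit to a relative limit at 80 µg/m³. A small helper capturing that logic follows; this is an assumed reading of the table, not code from Method 2.10 or Part 58 App A.

```python
# Illustration of the dichot collocated-precision acceptance logic in the
# table above (assumed reading): absolute agreement within 5 ug/m3 for
# concentrations below 80 ug/m3, relative agreement within 7% at or above it.

def dichot_pair_acceptable(primary: float, collocated: float) -> bool:
    """Return True if a collocated PM10 dichot pair meets the table's limits."""
    if primary < 80.0:
        return abs(primary - collocated) <= 5.0
    return abs(primary - collocated) / primary <= 0.07

print(dichot_pair_acceptable(42.0, 45.5))    # True: pair differs by 3.5 ug/m3
print(dichot_pair_acceptable(120.0, 131.0))  # False: ~9.2% difference
```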
PM10 Filter Based High Volume (HV) STP Conditions Validation Template
1) Criteria (PM10 Hi-Vol STP) | 2) Frequency | 3) Acceptable Range | Information/Action
CRITICAL CRITERIA - PM10 Filter Based Hi-Vol
Field Activities
Filter Holding Times
Sample Recovery | all filters | ASAP | 1, 2 and 3) 40 CFR Part 50 App J sec 9.15
Sampling Period | all filters | 1440 minutes ±60 minutes, midnight to midnight local standard time | 1, 2 and 3) 40 CFR Part 50 App J sec 7.1.5
Average Flow Rate | every 24 hours of op | ~1.13 m³/min (varies with instrument) | 1, 2 and 3) Method 2.11
Verification/Calibration
One-point Flow Rate Verification | 1/3 mo | ±7% of transfer standard and ±10% from design | 1 and 2) 40 CFR Part 58, App A, sec 3.2.3; 3) Method 2.11 sec 3.5.1, Table 2-1
Lab Activities
Filter
Visual Defect Check (unexposed) | all filters | see reference | Method 2.11 sec 4.2
Collection Efficiency | all filters | 99% | 1, 2 and 3) 40 CFR Part 50, App J sec 7.2.2
Alkalinity | all filters | <25.0 microequivalents/gram | 1, 2 and 3) 40 CFR Part 50, App J sec 7.2.4
Filter Conditioning Environment
Equilibration | all filters | 24 hours minimum | 1, 2 and 3) 40 CFR Part 50, App.J sec 9.3
Temp. Range | all filters | 15-30 °C | 1, 2 and 3) 40 CFR Part 50, App.J sec 7.4.1
Temp. Control | all filters | ±3 °C SD* over 24 hr | 1, 2 and 3) 40 CFR Part 50, App.J sec 7.4.2
Humidity Range | all filters | 20%-45% RH | 1, 2 and 3) 40 CFR Part 50, App.J sec 7.4.3
Humidity Control | all filters | ±5% SD* over 24 hr | 1, 2 and 3) 40 CFR Part 50, App.J sec 7.4.4
Pre/post Sampling RH | all filters | difference in 24-hr means <±5% RH | 1, 2 and 3) Recommendation based on Part 50, App.L sec 8.3.3
Balance | all filters | located in filter conditioning environment | 1, 2 and 3) Recommendation based on Part 50, App.L sec 8.3.2
OPERATIONAL EVALUATIONS TABLE - PM10 Filter Based Hi-Vol
Field Activities
Verification/Calibration
System Leak Check | during precalibration check | auditory inspection with faceplate blocked | 1, 2 and 3) Method 2.11 sec 2.3.2
FR Multi-point Verification/Calibration | 1/yr | 3 of 4 cal points within ±10% of design | 1, 2 and 3) Method 2.11 sec 2.3.2
Field Temp M-point Verification | on installation, then 1/yr | ±2 °C | 1, 2 and 3) Recommendation
Precision
Collocated Samples | every 12 days for 15% of sites | CV <10% of samples >15 µg/m³ | 1 and 2) 40 CFR Part 58 App A sec 3.2.5; 3) Recommendation
Semi Annual Flow Rate Audit | 1/6 mo | ±7% of transfer standard and ±10% from design | 1 and 2) 40 CFR Part 58, App A, sec 3.3.3; 3) Method 2.11 sec 7, Table 7-1
Monitor Maintenance
Inlet/downtube Cleaning | 1/3 mo | cleaned | 1, 2 and 3) Method 2.11 sec 6
Motor/housing gaskets | 1/3 mo | inspected/replaced | 1, 2 and 3) Method 2.11 sec 6
Blower motor brushes | 600-1000 hours | replace | 1, 2 and 3) Method 2.11 sec 6
Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP | NA
Lab Activities
Lab QC Checks
Balance Check (Standard Weight Check and Calibration Check) | beginning, 15th sample, end | ±0.5 mg of true zero and ±0.5 mg of 1-5 g check weight | 1, 2 and 3) Method 2.11 sec 4.5.1 and 4.5.2
Routine duplicate weighing | 5-7 per weighing session | ±2.8 mg change from original value | 1, 2 and 3) Method 2.11 sec 4.5.3; from routine filter set
Integrity - random sample of test field blank filters | 10% | ±5 µg/m³ | 1) 40 CFR Part 50 App J sec 7.2.3; 2) Recommendation; 3) 40 CFR Part 50 App J sec 7.2.3
Lab Temperature Calibration | 1/6 months | ±2 °C | 1, 2 and 3) Recommendation related to 40 CFR Part 50, App.L
Lab Humidity Calibration | 1/6 months | ±2% | 1, 2 and 3) Recommendation related to 40 CFR Part 50 App L
Microbalance Calibration | 1/yr | manufacturer's specification |
Audits
Filter Weighing | 1/yr | ±5 mg change from original value | 1) Method 2.11 Table 7-1; 2) Recommendation; 3) Method 2.11 Table 7-1
Balance Audit | 1/yr | observe weighing technique and check balance with ASTM Class 1 standard | 1) Method 2.11 Table 7-1; 2) Recommendation; 3) Method 2.11 Table 7-1
SYSTEMATIC CRITERIA - PM10 Filter Based Hi-Vol
Sampler/Monitor | NA | meets requirements listed in FRM/FEM/ARM designation | 1) 40 CFR Part 58 App C, Section 2.1; 2) NA; 3) 40 CFR Part 53 & FRM/FEM method list
Siting | 1/year | meets siting criteria or waiver documented | 1) 40 CFR Part 58 App E, sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E, sections 2-5
Data Completeness | quarterly | >75% | 1, 2 and 3) 40 CFR Part 50 App. K, sec. 2.3b & c
Reporting Units | all filters | µg/m³ at standard temperature and pressure | 1, 2 and 3) 40 CFR Part 50 App K sec. 1
Rounding convention for data reported to AQS | each routine concentration | nearest 10 µg/m³ (>5 round up) | 1, 2 and 3) 40 CFR Part 50 App K sec 1
Precision
Single analyzer | 1/3 mo. | coefficient of variation (CV) <10% for values >15 µg/m³ | 1, 2 and 3) Recommendation
Single analyzer | 1/yr | CV <10% for values >15 µg/m³ | 1, 2 and 3) Recommendation
Primary Quality Assurance Org. | annual and 3-year estimates | 90% CL of CV <10% for values >15 µg/m³ | 1, 2 and 3) Recommendation
Field Activities
Verification/Calibration Standards and Recertifications - All standards should have multi-point certifications against NIST-traceable standards
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable Std. | 1) 40 CFR Part 50, App.J sec 7.3; 2) Method 2.11 Sec 1.1.3; 3) 40 CFR Part 50, App.J sec 7.3
Field Thermometer | 1/yr | ±0.1 °C resolution, ±0.5 °C accuracy | 1, 2 and 3) Method 2.11 Sec 1.1.2
Field Barometer | 1/yr | ±1 mm Hg resolution, ±5 mm Hg accuracy | 1, 2 and 3) Method 2.11 Sec 1.1.2
Clock/timer Verification | 4/year | 5 min/mo | Recommendation
Lab Activities
Microbalance | at purchase | readability 0.1 mg, repeatability 0.5 mg (HV) | 1 and 2) 40 CFR Part 50, App.J sec 7.5; 3) Method 2.11 sec 4.4
Primary Mass Stds. (compare to NIST-traceable standards) | 1/yr | NIST traceable (e.g., ANSI/ASTM Class 1, 1.1 or 2) | 1, 2 and 3) Method 2.11 sec 9
SD = standard deviation; CV = coefficient of variation
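The PM10 rounding convention above (nearest 10 µg/m³, with remainders greater than 5 rounded up) is simple enough to state in a few lines of code. The sketch below is an illustration of that convention as written in the table, not AQS software.

```python
# Sketch of the PM10 AQS rounding convention stated above: report each routine
# concentration to the nearest 10 ug/m3, with remainders above 5 rounded up
# (so 154 -> 150 and 156 -> 160).

def round_pm10(conc: float) -> int:
    """Round a PM10 concentration (ug/m3) to the nearest 10; >5 rounds up."""
    tens, remainder = divmod(conc, 10.0)
    return int(tens) * 10 + (10 if remainder > 5.0 else 0)

for c in (154.0, 155.0, 156.0):
    print(c, "->", round_pm10(c))  # 150, 150, 160
```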
Continuous PM10 STP Conditions Validation Template
NOTE: There are a number of continuous PM10 monitors that are designated as FEM. These monitors may have different measurement or sampling attributes that cannot be identified in this validation template. Monitoring organizations should review specific instrument operating manuals and augment the validation template with QC information specific to their EPA reference or equivalent method designation and instrument (http://www.epa.gov/ttn/amtic/files/ambient/criteria/reference-equivalent-methods-list.pdf). In general, 40 CFR Part 58 App A and 40 CFR Part 50 App J requirements apply to continuous PM10. Since a guidance document was never developed for continuous PM10, many of the requirements reflect a combination of manual and continuous PM2.5 requirements and are therefore considered recommendations.
1) Criteria (PM10 Cont) | 2) Frequency | 3) Acceptable Range | Information/Action
CRITICAL CRITERIA - PM10 Continuous
Sampling Period | all filters | 1440 minutes ±60 minutes, midnight to midnight local standard time | 1, 2 and 3) 40 CFR Part 50 App J sec 7.1.5
Average Flow Rate | every 24 hours of op | average within ±5% of design | Recommendation
Verification/Calibration
One-point Flow Rate Verification | 1/mo | ±7% of transfer standard | 1 and 2) 40 CFR Part 58, App A, sec 3.2.3; 3) Method 2.10 Table 3-1
OPERATIONAL EVALUATIONS TABLE - PM10 Continuous
Verification/Calibration
System Leak Check | during precalibration check | auditory inspection with faceplate blocked | 1, 2 and 3) Method 2.11 sec 2.3.2
FR Multi-point Verification/Calibration | 1/yr | 3 of 4 cal points within ±10% of design | 1) 40 CFR Part 50 App J sec 8.0; 2 and 3) Method 2.10 Sec 2.2.4
Audits
Semi Annual Flow Rate Audit | 1/6 mo | ±10% of audit standard | 1 and 2) Part 58, App A, sec 3.2.4; 3) Method 2.10 Sec 7.1.5
Monitor Maintenance
Inlet/downtube Cleaning | 1/3 mo | cleaned | 1, 2 and 3) Method 2.10 sec 6.1.2
Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP |
SYSTEMATIC CRITERIA - PM10 Continuous
Sampler/Monitor | NA | meets requirements listed in FRM/FEM/ARM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 & FRM/FEM method list
Siting | 1/year | meets siting criteria or waiver documented | 1) 40 CFR Part 58 App E, sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E, sections 2-5
Data Completeness | 24-hour; quarterly | 23 hours; >75% | Recommendation; 40 CFR Part 50 App. K, sec. 2.3
Reporting Units | all filters | µg/m³ at standard temperature and pressure (STP) | 40 CFR Part 50 App K
Rounding convention for data reported to AQS (24-hour, 3-year average) | quarterly | nearest 10 µg/m³ (>5 round up) | 40 CFR Part 50 App K sec 1
Verification/Calibration Standards and Recertifications - All standards should have multi-point certifications against NIST-traceable standards
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable Std. | 1, 2 and 3) 40 CFR Part 50 App J sec 7.3
Field Thermometer | 1/yr | ±0.1 °C resolution, ±0.1 °C accuracy | 1, 2 and 3) Method 2.10 section 1.1.2
Field Barometer | 1/yr | ±1 mm Hg resolution, ±5 mm Hg accuracy | 1, 2 and 3) Method 2.10 section 1.1.2
Clock/timer Verification | 1/6 mo | 15 min/day | 1, 2 and 3) Method 2.10 sec 9
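The continuous PM10 completeness criteria above combine a daily hour count with a quarterly day count. The sketch below is one assumed reading of those criteria (23 valid hours defining a valid day is our interpretation of the table's "24-hour: 23 hours" entry); confirm against 40 CFR Part 50 App K before applying it.

```python
# Illustrative check of the completeness criteria listed above for continuous
# PM10: a valid day is assumed to require at least 23 of 24 hourly values, and
# a quarter is complete when >75% of scheduled sampling days are valid.

def day_is_valid(hourly_values: list) -> bool:
    """A sampling day with at least 23 valid (non-None) hourly values."""
    return sum(v is not None for v in hourly_values) >= 23

def quarter_complete(valid_days: int, scheduled_days: int) -> bool:
    """>75% of scheduled sampling days in the quarter produced valid data."""
    return valid_days / scheduled_days > 0.75

print(day_is_valid([12.0] * 23 + [None]))  # True: 23 valid hours
print(quarter_complete(70, 90))            # True: ~77.8%
```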
PM10 Low Volume STP Filter-Based Local Conditions Validation Template
Monitoring organizations can use low-volume PM instruments for PM10 monitoring. However, PM10 data collected for NAAQS purposes must be reported at standard temperature and pressure (STP). 40 CFR Part 50 App J describes the reference method for PM10, but this method was promulgated for the dichot and high volume methods, which have improved over the years. Since monitoring organizations may be able to use the low volume methods for multiple purposes (PM10c, PM10-Pb), it is suggested that the validation criteria for this method follow the method requirements associated with PM2.5, which is Appendix L. Where there are particular requirements directly related to the NAAQS evaluation, App J will be used.
1) Criteria (PM10 Lo-Vol STP) | 2) Frequency | 3) Acceptable Range | Information/Action
CRITICAL CRITERIA - PM10 Lo-Vol Filter Based STP
Field Activities
Filter Holding Times
Sample Recovery | all filters | <7 days 9 hours from sample end date | 1, 2 and 3) 40 CFR Part 50 App L Sec 10.10
Sampling Period (including multiple power failures) | all filters | 1440 minutes ±60 minutes, midnight to midnight local standard time | 1, 2 and 3) 40 CFR Part 50 App J sec 7.1.5
Sampling Instrument
Average Flow Rate | every 24 hours of op | average within 5% of 16.67 liters/minute | 1, 2 and 3) Part 50 App L Sec 7.4.3.1
Variability in Flow Rate | every 24 hours of op | CV <2% | 1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4.3.2
One-point Flow Rate Verification | 1/mo | ±4% of transfer standard; ±5% of flow rate design value | 1) 40 CFR Part 50, App.L, Sec 9.2.5 and 40 CFR Part 58, Appendix A Sec 3.2.3 & 3.3.2; 2) Recommendation; 3) 40 CFR Part 50, App.L, Sec 9.2.5 & 7.4.3.1
Laboratory Activities
Post-sampling Weighing | all filters | <10 days from sample end date if shipped at ambient temp, or <30 days from sample end date if shipped below avg ambient temp (or at 4 °C or below for avg sampling temps <4 °C) | 1, 2 and 3) 40 CFR Part 50 App L Sec 8.3.6
Filter Visual Defect Check (unexposed) | all filters | correct type & size; no pinholes, particles or imperfections | 1, 2 and 3) 40 CFR Part 50, App.L Sec 10.2
Filter Conditioning Environment
Equilibration | all filters | 24 hours minimum | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.5
Temp. Range | all filters | 24-hr mean 20-23 °C | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.1
Temp. Control | all filters | ±2 °C SD* over 24 hr | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.2
Humidity Range | all filters | 24-hr mean 30%-40% RH, or within ±5% of sampling RH but >20% RH | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.3
Humidity Control | all filters | ±5% SD* over 24 hr | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.2.4
Pre/post Sampling RH | all filters | difference in 24-hr means <±5% RH | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.3.3
Balance | all filters | located in filter conditioning environment | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.3.2
OPERATIONAL EVALUATIONS TABLE - PM10 Lo-Vol Filter Based STP
Field Activities
Sampling Instrument
Individual Flow Rates | every 24 hours of op | no flow rate excursions >±5% for >5 min 1/ | 1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4.3.1
Filter Temp Sensor | every 24 hours of op | no excursions of >5 °C lasting longer than 30 min 1/ | 1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4.11.4
Routine Verifications
External Leak Check | every 5 sampling events | <80 mL/min (see Comment #1) | 1) 40 CFR Part 50, App.L, Sec 7.4.6.1; 2) Method 2.12 Table 8-1; 3) 40 CFR Part 50, App.L, Sec 7.4.6.1
Internal Leak Check | every 5 sampling events | <80 mL/min | 1) 40 CFR Part 50, App.L, Sec 7.4.6.2; 2) Method 2.12 Table 8-1; 3) 40 CFR Part 50, App.L, Sec 7.4.6.2
One-point Temp Verification | 1/mo | ±2 °C | 1) 40 CFR Part 50, App.L, Sec 9.3; 2) Method 2.12 Table 6-1; 3) Recommendation
Pressure Verification | 1/mo | ±10 mm Hg | 1) 40 CFR Part 50, App.L, Sec 9.3; 2) Method 2.12 Table 6-1; 3) Recommendation
Annual Multi-point Verifications/Calibrations
Temperature Multi-point Verification/Calibration | on installation, then 1/yr | ±2 °C | 1) 40 CFR Part 50, App.L, Sec 9.3; 2 and 3) Method 2.12 Sec 6.4
Pressure Verification/Calibration | on installation, then 1/yr | ±10 mm Hg | 1) 40 CFR Part 50, App.L, Sec 9.3; 2 and 3) Method 2.12 Sec 6.5. Sampler BP verified against an independent standard; verified against a lab primary standard that is certified as NIST traceable 1/year.
Flow Rate Multi-point Verification/Calibration | on electromechanical maintenance, on transport, or 1/yr | ±4% of transfer standard | 1) 40 CFR Part 50, App.L, Sec 9.2; 2) 40 CFR Part 50, App.L, Sec 9.1.3 and Method 2.12 Table 6-1; 3) 40 CFR Part 50, App.L, Sec 9.2.5
Design Flow Rate Adjustment | at one-point or multi-point verification/calibration | ±2% of design flow rate | 1, 2 and 3) 40 CFR Part 50, App.L, Sec 9.2.6
Other Monitor Calibrations | per manufacturer's op manual | per manufacturer's operating manual | 1, 2 and 3) Recommendation
Precision
Collocated Samples | every 12 days for 15% of sites | CV <10% of samples >3 µg/m³ | 1 and 2) 40 CFR Part 58 App A Sec 3.2.5; 3) Recommendation
Accuracy
Temperature Audit | 1/yr | ±2 °C | 1, 2 and 3) Method 2.12 Sec. 10.2.2 & Table 3-1
Pressure Audit | 1/yr | ±10 mm Hg | 1, 2 and 3) Method 2.12 Sec. 10.2 & Table 3-1
Semi Annual Flow Rate Audit | 1/6 mo | ±4% of audit standard; ±5% of design flow rate | 1 and 2) Part 58, App A, Sec 3.3.3; 3) Method 2.12 Sec. 10.2.1 & Table 10-1
Monitor Maintenance
Impactor (WINS) | every 5 sampling events | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 8.3.1
Very Sharp Cut Cyclone | every 30 days | cleaned/changed | 1, 2 and 3) Recommendation
Inlet/downtube Cleaning | every 15 sampling events | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3 & 9.4.1
Filter Chamber Cleaning | 1/mo | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3
Circulating Fan Filter Cleaning | 1/mo | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 9.3
Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP |
Laboratory Activities
Filter Checks
Lot Blanks | 9 filters per lot | less than 15 µg change between weighings | 1, 2 and 3) Recommendation; used to determine filter stability of the lot of filters received from EPA or vendor
Exposure Lot Blanks | 3 filters per lot | less than 15 µg change between weighings | 1, 2 and 3) Method 2.12 Sec. 7.7; used for preparing a subset of filters for equilibration
Filter Integrity (exposed) | each filter | no visual defects | 1, 2 and 3) Method 2.12 Sec. 7.10
Filter Holding Times
Pre-sampling | all filters | <30 days before sampling | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.3.5
Lab QC Checks
Field Filter Blank | 10% or 1 per weighing session | ±30 µg change between weighings | 1) 40 CFR Part 50, App.L Sec 8.3.7.1; 2 and 3) Method 2.12 Sec. 7.7
Lab Filter Blank | 10% or 1 per weighing session | ±15 µg change between weighings | 1) 40 CFR Part 50, App.L Sec 8.3.7.2; 2 and 3) Method 2.12 Sec. 7.7
Balance Check (working standards) | beginning, 10th sample, end | <3 µg | 1, 2 and 3) Method 2.12 Sec. 7.9
Duplicate Filter Weighing | 1 per weighing session | ±15 µg change between weighings | 1, 2 and 3) Method 2.12 Sec 7.11
Microbalance Audit | 1/yr | ±0.050 mg or manufacturer's specs, whichever is tighter | 1, 2 and 3) Method 2.12 Sec. 10.2.6
Verification/Calibration
Lab Temperature | 1/6 months | ±2 °C | 1) Method 2.12 Table 3-2; 2) Recommendation (Table 3-2 suggests every 3 mo.); 3) Method 2.12 Table 3-2
Lab Humidity | 1/6 months | ±2% | 1) Method 2.12 Table 3-2; 2) Recommendation (Table 3-2 suggests every 3 mo.); 3) Method 2.12 Table 3-2
Microbalance Calibration | at installation and prior to each weighing session, 1/yr | manufacturer's specification | 1) 40 CFR Part 50, App.L, Sec 8.1; 2) 40 CFR Part 50, App.L, Sec 8.1 and Method 2.12 Sec. 7.2; 3) NA
Calibration & Check Standards - Working Mass Stds. (compare to primary standards) | 1/3 mo. | ±0.025 mg | 1, 2 and 3) Method 2.12 Sec 4.3 and 7.3
SYSTEMATIC CRITERIA - PM10 Lo-Vol Filter Based STP
Sampler/Monitor | NA | meets requirements listed in FRM/FEM/ARM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 & FRM/FEM method list
Siting | 1/year | meets siting criteria or waiver documented | 1) 40 CFR Part 58 App E, sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E, sections 2-5
Data Completeness | 24-hour standard | >75% scheduled sampling days in each quarter | 1, 2 and 3) 40 CFR Part 50 App. K, sec. 2.3b
Reporting Units | all filters | µg/m³ at standard temperature and pressure | 1, 2 and 3) 40 CFR Part 50 App K sec. 1
Rounding convention for data reported to AQS | each routine concentration | nearest 10 µg/m³ (>5 round up) | 1, 2 and 3) 40 CFR Part 50 App K sec 1
Detection Limit
Lower DL | all filters | 2 µg/m³ | 1, 2 and 3) 40 CFR Part 50, App.L Sec 3.1
Upper Conc. Limit | all filters | 200 µg/m³ | 1, 2 and 3) 40 CFR Part 50, App.L Sec 3.2
Precision
Single analyzer | 1/3 mo. | coefficient of variation (CV) <10% for values >3 µg/m³ | 1, 2 and 3) Recommendation
Single analyzer | 1/yr | CV <10% for values >3 µg/m³ | 1, 2 and 3) Recommendation
Primary Quality Assurance Org. | annual and 3-year estimates | 90% CL of CV <10% for values >3 µg/m³ | 1, 2 and 3) Recommendation
Field Activities
Verification/Calibration Standards Recertifications - All standards should have multi-point certifications against NIST-traceable standards
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable Std. | 1) 40 CFR Part 50, App.L Sec 9.1 & 9.2; 2) Method 2.12 Section 6.3.3 and Table 3-1; 3) 40 CFR Part 50, App.L Sec 9.1 & 9.2
Field Thermometer | 1/yr | ±0.1 °C resolution, ±0.5 °C accuracy | 1, 2 and 3) Method 2.12 Sec 4.2.2 & Table 3-1
Field Barometer | 1/yr | ±1 mm Hg resolution, ±5 mm Hg accuracy | 1, 2 and 3) Method 2.12 Sec 4.2.2 & Table 3-1
Clock/timer Verification | 1/mo | 1 min/mo | 1 and 2) Method 2.12 Table 3-1; 3) 40 CFR Part 50, App.L Sec 7.4.12
Laboratory Activities
Microbalance Readability | at purchase | 1 µg | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.1
Microbalance Repeatability | 1/yr | 1 µg | 1) Method 2.12 Sec 4.3.6; 2) Recommendation; 3) Method 2.12 Sec 4.3.6
Primary Mass Stds. Verification/Calibration Standards Recertifications | 1/yr | ±0.025 mg | 1, 2 and 3) Method 2.12 Sec 4.3.7 & Table 3-2
Comment #1
It is stated in the CFR that the criterion is <80 mL/min. Exactly what samplers use this unit of measure? Most, if not all, samplers use either the in Hg or the mm Hg unit. How can a volumetric flow unit be converted to a pressure unit? Is there any way to change or add more applicable units to ease the confusion? The following is in the PM2.5 PEP SOP: "To pass the test, the actively displayed differential system pressure (shown on the right side of the screen as SP) must not drop by more than 5 cm of water during the 2-minute time interval (or 10 cm of water if using a 10-minute time interval)." This is equivalent to the 80 mL/min acceptance criterion stated in related QA documents.
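The one-point flow rate verification in this template carries two simultaneous limits: ±4% of the transfer standard and ±5% of the 16.67 L/min design flow. A minimal sketch of that arithmetic follows; it is an assumed reading of the table, not a calculation taken from Method 2.12.

```python
# A minimal sketch of the one-point flow rate verification above (assumed
# arithmetic): percent difference of the sampler's indicated flow from the
# transfer standard (limit +/-4%) and from the 16.67 L/min design flow
# (limit +/-5%). Both limits must be met for the check to pass.

DESIGN_FLOW = 16.67  # L/min

def flow_check(sampler_flow: float, standard_flow: float) -> bool:
    pd_standard = 100.0 * (sampler_flow - standard_flow) / standard_flow
    pd_design = 100.0 * (sampler_flow - DESIGN_FLOW) / DESIGN_FLOW
    return abs(pd_standard) <= 4.0 and abs(pd_design) <= 5.0

print(flow_check(16.9, 16.7))  # True: +1.2% vs standard, +1.4% vs design
```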
Pb High Volume (TSP) Validation Template
Note: In 2008, the NAAQS for Pb was lowered and new monitoring rules were promulgated which allowed for the use of federal equivalent analytical methods and the use of PM10 sampling in certain circumstances. The following information is guidance based on the current FRM, which is sampling by TSP and analysis by atomic absorption. Information in this table is derived from the TSP sampling method in 40 CFR Part 50 App B and from QA Handbook Method 2.2 (1977). The analytical requirements/guidance are derived from 40 CFR Part 50, App G and QA Handbook Method 2.8 (1981). Monitoring for Pb based on the new NAAQS requirements will begin in calendar year 2010. Revised and/or additional Pb validation templates will be included in this section (if published before this version of the Handbook) or posted on AMTIC.
1) Criteria | 2) Frequency | 3) Acceptable Range | 4) Information/Action
CRITICAL CRITERIA - Pb in TSP
Field Activities
Filter Holding Times
Sample Recovery | all filters | ASAP | 1, 2 and 3) 40 CFR Part 50 App B sec 6.3
Sampling Period | all filters | 1440 minutes ±60 minutes, midnight to midnight local standard time | 1, 2 and 3) 40 CFR Part 50 App B sec 8.15
Sampling Instrument
Average Flow Rate | every 24 hours of op | 1.1-1.7 m³/min (varies with instrument) at actual conditions | 1, 2 and 3) 40 CFR Part 50 App B sec 8.8
One-point Flow Rate Verification | 1/3 mo | ±7% from transfer standard | 1 and 2) 40 CFR Part 58 App A sec 3.3.4.1; 3) Method 2.2 sec 2.6
Lab Activities
Filter
Visual Defect Check (unexposed) | all filters | initial backlight inspection: no pinholes or imperfections; visual inspection prior to shipping to analytical lab | 1, 2 and 3) 40 CFR Part 50 App B sec 8.2
Collection Efficiency | all filters | 99% | 1, 2 and 3) 40 CFR Part 50 App B sec 7.1.4
Pressure Drop Range | all filters | 42-54 mm Hg | 1, 2 and 3) 40 CFR Part 50 App B sec 7.1.5
pH | all filters | 6-10 | 1, 2 and 3) 40 CFR Part 50, App B sec 7.1.6
Pb Content | all filters, pre-sampling batch check | <75 µg/filter | 1, 2 and 3) 40 CFR Part 50, App G sec 6.1.1.1; Method 2.8 sec 6.2.1 provides more information on whether filters should be corrected for blanks
Calibration Reproducibility Checks | beginning, every 10 samples, and end | ±5% of value predicted by calibration curve | 1, 2 and 3) 40 CFR Part 50, App G Sec 9.3; may be FEM dependent
Reagent Blank | every analytical batch | <LDL | 1, 2 and 3) Recommendation
Daily Calibration | daily (on day of analysis) | until good agreement is obtained among replicates | 1, 2 and 3) Method 2.8 sec 2.8.5
OPERATIONAL EVALUATIONS TABLE - Pb in TSP
Field Activities
Verification/Calibration
System Leak Check | during precalibration check | visual and auditory inspection with faceplate blocked | 1, 2 and 3) Recommendation
FR Multi-point Verification/Calibration | after receipt, after motor maintenance or failure of 1-point check, and 1/yr | 5 points over range of 1.1 to 1.7 m³/min within ±5% limits of linearity | 1, 2 and 3) Method 2.2 sec 2.6
Precision
Collocated Samples | 15% of each method code in PQAO; frequency: every 12 days | CV <20% of samples >0.02 µg/m³ (cutoff value) | 1 and 2) 40 CFR Part 58 App A sec 3.3.4.3; 3) Recommendation for early evaluation of DQOs
Semi Annual Flow Rate Audit | 1/6 mo | ±7% of audit standard | 1 and 2) 40 CFR Part 58, App A, sec 3.3.4.1; 3) Method 2.2 Table 8.2
Monitor Maintenance
Inlet cleaning | 1/3 mo | cleaned | 1, 2 and 3) Recommendation
Motor/housing gaskets | ~400 hours | inspected/replaced | 1, 2 and 3) Method 2.2 sec 7
Blower motor brushes | 400-500 hours | replace | 1, 2 and 3) Method 2.2 sec 7
Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP | NA
Lab Activities
Analysis Audits | 6 strips/quarter, 3 at each concentration range | ±10% (percent difference) | 1 and 2) 40 CFR Part 58, App A, sec 3.3.4.2; 3) Recommendation
Field Filter Blank | 1/quarter | <LDL | 1, 2 and 3) Recommendation
Lab Blanks | 1/sample run | <LDL | 1, 2 and 3) Recommendation
Control Standards (1 µg Pb/ml and a standard between 1-10 µg Pb/ml) | 1st, every 10 samples, and last sample | deviation of <5% from value predicted by calibration curve | 1, 2 and 3) Method 2.8 section 5.7.3
SYSTEMATIC CRITERIA - Pb Filter Based Hi-Vol
Sampler/Monitor | NA | meets requirements listed in FRM/FEM/ARM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 & FRM/FEM method list; also described in 40 CFR Part 50 App B sec 7.2
Siting | 1/year | meets siting criteria or waiver documented | 1) 40 CFR Part 58 App E, sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E, sections 2-5
Data Completeness | 3-year standard | average of the 3 constituent monthly means >75% | 1, 2 and 3) 40 CFR Part 50 App. R, sec. 4; in addition, there are substitution tests that can be used for data not meeting completeness criteria
Reporting Units | all filters | µg/m³ at local temperature and pressure | 1, 2 and 3) 40 CFR Part 50 App R sec 3(b)
Rounding convention for data reported to AQS (3-month arithmetic mean) | quarterly | report data to 3 decimal places (digits after the third decimal place are truncated) | 1, 2 and 3) 40 CFR Part 50 App R sec 3(b)
Lower Detectable Limit (AA) | all samples | 0.07 µg Pb/m³ | 1, 2 and 3) 40 CFR Part 50 App G sec 2.3
Precision
Single analyzer | 1/3 mo. | coefficient of variation (CV) <20% for values >0.02 µg/m³ | 1 and 2) 40 CFR Part 58 App A sec 3.3.4.3; 3) Recommendation related to DQO
Primary Quality Assurance Org. | annual and 3-year estimates | 90% CL of CV <20% for values >0.02 µg/m³ | 1, 2 and 3) 40 CFR Part 58 App A sec 3.3.4.3 and sec 2.3.1.4
Bias
Performance Evaluation Program (PEP) | 5 audits for PQAOs with ≤5 sites; 8 audits for PQAOs with >5 sites | 95% CL absolute bias ±15% for values >0.02 µg/m³ | 1, 2 and 3) 40 CFR Part 58 App A sec 3.3.4.4 and sec 2.3.1.4. The PEP includes 1 or 2 independent collocated audits and 4 or 6 samples from the monitoring organization's collocated monitor sent to the independent National PEP Laboratory.
Field Activities
Verification/Calibration Standards and Recertifications - All standards should have multi-point certifications against NIST-traceable standards
Flow Rate Transfer Std. | 1/yr | 0.02 m³/min resolution; ±2% reproducibility | 1) 40 CFR Part 50, App.B sec 7.8; 2) Method 2.2 section 2.5; 3) 40 CFR Part 50, App.B sec 7.8
Field Thermometer | 1/yr | 2 °C resolution | 1) 40 CFR Part 50, App.B sec 7.5; 2) Recommendation; 3) 40 CFR Part 50, App.B sec 7.5
Field Barometer | 1/yr | ±5 mm Hg resolution | 1) 40 CFR Part 50, App.B sec 7.6; 2) Recommendation; 3) 40 CFR Part 50, App.B sec 7.6
Clock/timer Verification | 1/3 mo. | ±2 min/24-hour | 1, 2 and 3) Method 2.2 section 2.3
Lab Activities
Analytical Standards
Reagents (HNO3 and HCl) | all | ACS reagent grade | 1, 2 and 3) 40 CFR Part 50 App G sec 6.2.1
Pb nitrate, Pb(NO3)2 | all | ACS reagent grade (99.0% purity) | 1, 2 and 3) 40 CFR Part 50 App G sec 6.2.8
SD = standard deviation; CV = coefficient of variation
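The Pb reporting convention above calls for truncation, not rounding, after the third decimal place. A one-function sketch of that behavior follows; it illustrates the table's reading of 40 CFR Part 50 App R sec 3(b) and is not official AQS code.

```python
# Sketch of the Pb reporting convention above: the 3-month mean is reported
# to three decimal places, with any further digits truncated rather than
# rounded.
import math

def truncate_pb(conc: float) -> float:
    """Truncate a Pb concentration (ug/m3) to 3 decimal places."""
    return math.floor(conc * 1000.0) / 1000.0

print(truncate_pb(0.15789))  # 0.157, not 0.158
```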
PM10-Pb Low Volume Filter-Based Local Conditions Validation Template
NOTE: The following validation template was constructed for use of PM10-Pb at local conditions, where the PM10c method in 40 CFR Part 50 Appendix O is referenced. Although the PM10-2.5 method is found in 40 CFR Part 50 Appendix O, Appendix O also references Appendix L (the PM2.5 method) for the QC requirements listed below. Therefore, the Information/Action column, in most cases, will reference 40 CFR Part 50 App L. In addition, since the PM10 samplers are very similar to the PM2.5 samplers, Guidance Document 2.12, Monitoring PM2.5 in Ambient Air Using Designated Reference or Class I Equivalent Methods, is referred to where appropriate. At present the only analytical FRM is XRF; therefore, the quality control criteria are associated with the XRF method, which is promulgated in 40 CFR Part 50 Appendix Q.
1) Criteria (PM10-Pb Lo-Vol) | 2) Frequency | 3) Acceptable Range | Information/Action
CRITICAL CRITERIA - PM10-Pb Filter Based Local Conditions
Field Activities
Filter Holding Times
Sample Recovery | all filters | ASAP | 1, 2 and 3) 40 CFR Part 50 App B sec 6.3. If filters are used for more than one purpose (i.e., Pb and PM10), the sample recovery is dictated by the most stringent requirement.
Sampling Period (including multiple power failures) | all filters | 1440 minutes ±60 minutes, midnight to midnight local standard time | 1, 2 and 3) 40 CFR Part 50 App B sec 8.15. If filters are used for more than one purpose (i.e., Pb and PM10), the sample recovery is dictated by the most stringent requirement.
Sampling Instrument
Average Flow Rate | every 24 hours of op | average within 5% of 16.67 liters/minute | 1, 2 and 3) 40 CFR Part 50 App L Sec 7.4.3.1
Variability in Flow Rate | every 24 hours of op | CV <2% | 1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4.3.2
One-point Flow Rate Verification | 1/mo | ±4% of transfer standard; ±5% of flow rate design value | 1) 40 CFR Part 50, App.L, Sec 9.2.5 and 40 CFR Part 58, Appendix A Sec 3.2.3 & 3.3.2; 2) Recommendation; 3) 40 CFR Part 50, App.L, Sec 9.2.5
Laboratory Activities (XRF Analysis)
Filter Visual Defect Check (unexposed) | all filters | correct type & size; no pinholes, particles or imperfections | 1, 2 and 3) 40 CFR Part 50, App.L Sec 10.2
Pb Blank Filter Acceptance Testing | ~20 test filters per lot | 90% of filters <4.8 ng Pb/cm² | 1, 2 and 3) 40 CFR Part 50 App Q Sec 6.1.2
OPERATIONAL EVALUATIONS TABLE - PM10-Pb Filter Based Local Conditions
Field Activities
Sampling Instrument
Individual Flow Rates | every 24 hours of op | no flow rate excursions >±5% for >5 min 1/ | 1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4.3.1
Filter Temp Sensor | every 24 hours of op | no excursions of >5 °C lasting longer than 30 min 1/ | 1, 2 and 3) 40 CFR Part 50, App.L Sec 7.4
Routine Verifications
External Leak Check | every 5 sampling events | <80 mL/min (see Comment #1) | 1) 40 CFR Part 50, App.L, Sec 7.4.6.1; 2) Method 2.12 Table 8-1; 3) 40 CFR Part 50, App.L, Sec 7.4.6.1
Internal Leak Check | every 5 sampling events | <80 mL/min | 1) 40 CFR Part 50, App.L, Sec 7.4.6.2; 2) Method 2.12 Table 8-1; 3) 40 CFR Part 50, App.L, Sec 7.4.6.2
One-point Temp Verification | 1/mo | ±2 °C | 1) 40 CFR Part 50, App.L, Sec 9.3; 2) Method 2.12 Table 6-1; 3) Recommendation
Pressure Verification | 1/mo | ±10 mm Hg | 1) 40 CFR Part 50, App.L, Sec 9.3; 2) Method 2.12 Table 6-1; 3) Recommendation
Annual Multi-point Verifications/Calibrations
Temperature Multi-point Verification/Calibration | on installation, then 1/yr | ±2 °C | 1) 40 CFR Part 50, App.L, Sec 9.3; 2 and 3) Method 2.12 Sec 6.4
Pressure Verification/Calibration | on installation, then 1/yr | ±10 mm Hg | 1) 40 CFR Part 50, App.L, Sec 9.3; 2 and 3) Method 2.12 Sec 6.5. Sampler BP verified against an independent standard; verified against a lab primary standard that is certified as NIST traceable 1/year.
Flow Rate Multi-point Verification/Calibration | on electromechanical maintenance, on transport, or 1/yr | ±4% of transfer standard | 1) 40 CFR Part 50, App.L, Sec 9.2; 2) 40 CFR Part 50, App.L, Sec 9.1.3 and Method 2.12 Table 6-1; 3) 40 CFR Part 50, App.L, Sec 9.2.5
Design Flow Rate Adjustment | at one-point or multi-point verification/calibration | ±2% of design flow rate | 1, 2 and 3) 40 CFR Part 50, App.L, Sec 9.2.2
Other Monitor Calibrations | per manufacturer's op manual | per manufacturer's operating manual | 1, 2 and 3) Recommendation
Precision
Collocated Samples | 15% of each method code in PQAO; frequency: every 12 days | CV <20% of samples >0.02 µg/m³ (cutoff value) | 1 and 2) 40 CFR Part 58 App A sec 3.3.4.3; 3) Recommendation for early evaluation of DQOs
Accuracy
Temperature Audit | 1/yr | ±2 °C | 1, 2 and 3) Method 2.12 Sec. 10.2.2 & Table 3-1
Pressure Audit | 1/yr | ±10 mm Hg | 1, 2 and 3) Method 2.12 Sec. 10.2.3 & Table 3-1
Semi Annual Flow Rate Audit | 1/6 mo | ±4% of audit standard; ±5% of design flow rate | 1 and 2) 40 CFR Part 58 App A, Sec 3.3.3; 3) Method 2.12 Sec. 10.2.1 & Table 10-1
Monitor Maintenance
Impactor (WINS) | every 5 sampling events | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 8.3.1
Very Sharp Cut Cyclone | every 30 days | cleaned/changed | 1, 2 and 3) Recommendation
Inlet/downtube Cleaning | every 15 sampling events | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3 & 9.4.1
Filter Chamber Cleaning | 1/mo | cleaned | 1, 2 and 3) Method 2.12 Sec 9.3
Circulating Fan Filter Cleaning | 1/mo | cleaned/changed | 1, 2 and 3) Method 2.12 Sec 9.3
Manufacturer-Recommended Maintenance | per manufacturer's SOP | per manufacturer's SOP |
Laboratory Activities (XRF Analysis)
Filter Holding Times
Pre-sampling | all filters | <30 days before sampling | 1, 2 and 3) 40 CFR Part 50, App.L Sec 8.3.5. Required only if filters will be used for PM10c mass as well as Pb; if used only for Pb, the 30-day pre-sampling holding time is not required.
Analysis Audits | 6 filters/quarter, 3 at each concentration range | ±10% (percent difference) | 1 and 2) 40 CFR Part 58, App A, sec 3.3.4.2; 3) Recommendation
Field Filter Blank | 1/quarter | <0.01 µg/m³ | 1) 40 CFR Part 50 App Q sec 6.1.2.1; 2 and 3) Recommendation
Lab Filter Blank | 1/sample run | <0.003 µg/m³ | 1) 40 CFR Part 50 App Q sec 6.1.2.1; 2 and 3) Recommendation
Thin Film Standards (standard reference materials) | beginning and end of each analytical run | XRF conc. ±3x the 1-sigma uncertainty overlaps the NIST certified conc. ±1x its reported uncertainty | 1) 40 CFR Part 50 App Q sec 6.2.3; 2 and 3) Recommendation
Run-time quality control standards (checking peak areas, background areas, centroid and FWHM) | beginning and end of each analytical run | target value ±3 SD | 1, 2 and 3) Recommendation; target values and SD of QC samples established prior to analysis
XRF analyzer calibration | 1/year, or when significant repairs or changes occur, or when QC limits are exceeded | XRF conc. ±3x the 1-sigma uncertainty overlaps the NIST certified conc. ±1x its reported uncertainty | 1 and 2) 40 CFR Part 50 App Q sec 6.2.4; 3) Recommendation
Background Measurement and Correction | 20 clean blank filters for each filter lot used | NA | 1 and 2) 40 CFR Part 50 App Q sec 6.2.4.2
SYSTEMATIC CRITERIA - PM10-Pb Filter Based Local Conditions
Sampler/Monitor | NA | meets requirements listed in FRM/FEM designation | 1) 40 CFR Part 58 App C Section 2.1; 2) NA; 3) 40 CFR Part 53 & FRM/FEM method list
Siting | 1/year | meets siting criteria or waiver documented | 1) 40 CFR Part 58 App E, sections 2-5; 2) Recommendation; 3) 40 CFR Part 58 App E, sections 2-5
Data Completeness | 3-year standard | average of the 3 constituent monthly means >75% | 1, 2 and 3) 40 CFR Part 50 App. R, sec. 4; in addition, there are substitution tests that can be used for data not meeting completeness criteria
Reporting Units | all filters | µg/m³ at local temperature and pressure | 1, 2 and 3) 40 CFR Part 50 App R sec 3(b)
Rounding convention for data reported to AQS (3-month mean) | quarterly | report data to 3 decimal places (digits after the third decimal place are truncated) | 1, 2 and 3) 40 CFR Part 50 App R sec 3(b)
Lower DL | all filters | <0.001 µg/m³ | 1, 2 and 3) 40 CFR Part 50 App Q Sec 2.2
Upper Conc. Limit | all filters | >200 µg/m³ | 1, 2 and 3) 40 CFR Part 50, App.Q Sec 3.1
Precision
Single analyzer | 1/3 mo. | coefficient of variation (CV) <20% for values >0.02 µg/m³ | 1 and 2) 40 CFR Part 58 App A sec 3.3.4.3; 3) Recommendation related to DQO
Primary Quality Assurance Org. | annual and 3-year estimates | 90% CL of CV <20% for values >0.02 µg/m³ | 1, 2 and 3) 40 CFR Part 58 App A sec 3.3.4.3 and sec 2.3.1.4
Bias
Performance Evaluation Program (PEP) | 5 audits for PQAOs with ≤5 sites; 8 audits for PQAOs with >5 sites | 95% CL absolute bias ±15% for values >0.02 µg/m³ | 1, 2 and 3) 40 CFR Part 58 App A sec 3.3.4.4 and sec 2.3.1.4. The PEP includes 1 or 2 independent collocated audits and 4 or 6 samples from the monitoring organization's collocated monitor sent to the independent National PEP Laboratory.
Field Activities
Verification/Calibration Standards Recertifications - All standards should have multi-point certifications against NIST-traceable standards
Flow Rate Transfer Std. | 1/yr | ±2% of NIST-traceable Std. | 1) 40 CFR Part 50, App.L Sec 9.1 & 9.2; 2) Method 2.12 Section 6.3.3 and Table 3-1; 3) 40 CFR Part 50, App.L Sec 9.1 & 9.2
Field Thermometer | 1/yr | ±0.1 °C resolution, ±0.5 °C accuracy | 1, 2 and 3) Method 2.12 Sec 4.2.2 & Table 3-1
Field Barometer | 1/yr | ±1 mm Hg resolution, ±5 mm Hg accuracy | 1, 2 and 3) Method 2.12 Sec 4.2.2 & Table 3-1
Verification/Calibration
Clock/timer Verification | 1/mo | 1 min/mo | 1 and 2) Method 2.12 Table 3-1; 3) 40 CFR Part 50, App.L, Sec 7.4.12
Comment #1
It is stated in the CFR that the criterion is <80 mL/min. Exactly what samplers use this unit of measure? Most, if not all, samplers use either the in Hg or the mm Hg unit. How can a volumetric flow unit be converted to a pressure unit? Is there any way to change or add more applicable units to ease the confusion? The following is in the PM2.5 PEP SOP: "To pass the test, the actively displayed differential system pressure (shown on the right side of the screen as SP) must not drop by more than 5 cm of water during the 2-minute time interval (or 10 cm of water if using a 10-minute time interval)." This is equivalent to the 80 mL/min acceptance criterion stated in related QA documents.
1/ Value must be flagged. SD = standard deviation; CV = coefficient of variation.
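The XRF calibration acceptance test in the template above is an interval-overlap condition: the measured value ±3 times its 1-sigma uncertainty must overlap the NIST certified value ±1 times its reported uncertainty. The sketch below captures that logic; it is an illustration of the table's criterion, not code from 40 CFR Part 50 App Q.

```python
# Illustration of the XRF calibration acceptance test above: the measured
# concentration +/- 3x its 1-sigma uncertainty must overlap the NIST
# certified concentration +/- 1x its reported uncertainty.

def xrf_calibration_ok(xrf_conc: float, xrf_sigma: float,
                       nist_conc: float, nist_unc: float) -> bool:
    """True when the two uncertainty intervals overlap."""
    xrf_lo, xrf_hi = xrf_conc - 3.0 * xrf_sigma, xrf_conc + 3.0 * xrf_sigma
    nist_lo, nist_hi = nist_conc - nist_unc, nist_conc + nist_unc
    return xrf_lo <= nist_hi and nist_lo <= xrf_hi

print(xrf_calibration_ok(10.2, 0.15, 10.6, 0.1))  # True: intervals overlap
```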
Appendix E
Characteristics of Spatial Scales Related to Each Pollutant
The following tables provide information in order to match the spatial scale represented by the monitor
with the monitoring objectives.
NOTE: This information can also be found in 40 CFR Part 58, Appendix D and since there is a
possibility that spatial scales have been updated, users should also review CFR.
http://www.ecfr.gov/cgi-bin/text-idx?tpl=/ecfrbrowse/Title40/40tab_02.tpl
Pollutant | Spatial Scale | Characteristics (NOTE: This information can also be found in 40 CFR Part 58, Appendix D; since spatial scales may have been updated, users should also review the CFR.)
NCore | Urban, Rural
Urban: Generally located at urban or neighborhood scale to provide representative concentrations of exposure expected throughout the metropolitan area; however, a middle-scale site may be acceptable in cases where the site can represent many such locations throughout a metropolitan area.
Rural: Rural NCore stations are to be located to the maximum extent practicable at a regional or larger scale away from any large local emission source, so that they represent ambient concentrations over an extensive area.
PM10 | Micro, Middle, Neighborhood
Micro: This scale would typify areas such as downtown street canyons, traffic corridors, and fence line stationary source monitoring locations where the general public could be exposed to maximum PM10 concentrations. Microscale particulate matter sites should be located near inhabited buildings or locations where the general public can be expected to be exposed to the concentration measured. Emissions from stationary sources such as primary and secondary smelters, power plants, and other large industrial processes may, under certain plume conditions, likewise result in high ground level concentrations at the microscale. In the latter case, the microscale would represent an area impacted by the plume with dimensions extending up to approximately 100 meters. Data collected at microscale sites provide information for evaluating and developing hot spot control measures.
Middle: Much of the short-term public exposure to coarse fraction particles (PM10) is on this scale and on the neighborhood scale. People moving through downtown areas, or living near major roadways or stationary sources, may encounter particulate pollution that would be adequately characterized by measurements of this spatial scale. Middle scale PM10 measurements can be appropriate for the evaluation of possible short-term exposure public health effects. In many situations, monitoring sites that are representative of micro-scale or middle-scale impacts are not unique and are representative of many similar situations. This can occur along traffic corridors or other locations in a residential district. In this case, one location is representative of a neighborhood of small scale sites and is appropriate for evaluation of long-term or chronic effects. This scale also includes the characteristic concentrations for other areas with dimensions of a few hundred meters, such as the parking lot and feeder streets associated with shopping centers, stadia, and office buildings. In the case of PM10, unpaved or seldom swept parking lots associated with these sources could be an important source.
Neighborhood: Measurements in this category represent conditions throughout some reasonably homogeneous urban subregion with dimensions of a few kilometers and of generally more regular shape than the middle scale. Homogeneity refers to the particulate matter concentrations, as well as the land use and land surface characteristics. In some cases, a location carefully chosen to provide neighborhood scale data would represent not only the immediate neighborhood but also neighborhoods of the same type in other parts of the city. Neighborhood scale PM10 sites provide information about trends and compliance with standards because they often represent conditions in areas where people commonly live and work for extended periods. Neighborhood scale data could provide valuable information for developing, testing, and revising models that describe the larger-scale concentration patterns, especially those models relying on spatially smoothed emission fields for inputs. The neighborhood scale measurements could also be used for neighborhood comparisons within or between cities.
SO2 | Micro, Middle, Neighborhood, Urban
Micro: This scale would typify areas in close proximity to SO2 point and area sources. Emissions from stationary point and area sources, and non-road sources may, under certain plume conditions, result in high ground level concentrations at the microscale. The microscale typically represents an area impacted by the plume with dimensions extending up to approximately 100 meters.
Middle: This scale generally represents air quality levels in areas up to several city blocks in size, with dimensions on the order of approximately 100 meters to 500 meters. The middle scale may include locations of expected maximum short-term concentrations due to proximity to major SO2 point, area, and/or non-road sources.
Neighborhood: The neighborhood scale would characterize air quality conditions throughout some relatively uniform land use areas with dimensions in the 0.5 to 4.0 kilometer range. Emissions from stationary point and area sources may, under certain plume conditions, result in high SO2 concentrations at the neighborhood scale. Where a neighborhood site is located away from immediate SO2 sources, the site may be useful in representing typical air quality values for a larger residential area, and therefore suitable for population exposure and trends analyses.
Urban: Measurements in this scale would be used to estimate concentrations over large portions of an urban area with dimensions from 4 to 50 kilometers. Such measurements would be useful for assessing trends in area-wide air quality, and hence, the effectiveness of large scale air pollution control strategies. Urban scale sites may also support other monitoring objectives of the SO2 monitoring network such as identifying trends and, when monitors are sited upwind of local sources, background concentrations.
CO
Micro: This scale applies when air quality measurements are to be used to represent distributions within street canyons, over sidewalks, and near major roadways. In the case of carbon monoxide, microscale measurements in one location can often be considered representative of other similar locations in a city.
Middle: Middle scale measurements are intended to represent areas with dimensions from 100 meters to 0.5 kilometer. In certain cases, middle scale measurements may apply to areas that have a total length of several kilometers, such as line emission source areas. These source areas would include air quality along a commercially developed street or shopping plaza, freeway corridors, parking lots, and feeder streets.
Neighborhood: Neighborhood scale measurements are intended to represent areas with dimensions from 0.5 kilometers to 4 kilometers. Measurements of CO in this category would represent conditions throughout some reasonably homogeneous urban sub-regions. In some cases, neighborhood scale data may represent not only the immediate neighborhood spatial area, but also other similar such areas across the larger urban area. Neighborhood scale measurements provide relative area-wide concentration data which are useful for providing relative urban background concentrations, supporting health and scientific research, and for use in modeling.
O3
Neighborhood: Measurements in this category represent conditions throughout some reasonably homogeneous urban subregion, with dimensions of a few kilometers. Homogeneity refers to pollutant concentrations. Neighborhood scale data will provide valuable information for developing, testing, and revising concepts and models that describe urban/regional concentration patterns. These data will be useful to the understanding and definition of processes that take periods of hours to occur and hence involve considerable mixing and transport. Under stagnation conditions, a site located in the neighborhood scale may also experience peak concentration levels within a metropolitan area.
Urban: Measurements in this scale will be used to estimate concentrations over large portions of an urban area with dimensions of several kilometers to 50 or more kilometers. Such measurements will be used for determining trends and designing area-wide control strategies. The urban scale sites would also be used to measure high concentrations downwind of the area having the highest precursor emissions.
Regional: This scale of measurement will be used to typify concentrations over large portions of a metropolitan area and even larger areas with dimensions of as much as hundreds of kilometers. Such measurements will be useful for assessing the O3 that is transported to and from a metropolitan area, as well as background concentrations. In some situations, particularly when considering very large metropolitan areas with complex source mixtures, regional scale sites can be the maximum concentration location.
NO2
Micro: This scale represents areas in close proximity to major roadways or point and area sources. Emissions from roadways result in high ground level NO2 concentrations at the microscale, where concentration gradients generally exhibit a marked decrease with increasing downwind distance from major roads. As noted in appendix E of this part, near-road NO2 monitoring stations are required to be within 50 meters of target road segments in order to measure expected peak concentrations. Emissions from stationary point and area sources, and non-road sources may, under certain plume conditions, result in high ground level concentrations at the microscale. The microscale typically represents an area impacted by the plume with dimensions extending up to approximately 100 meters.
Middle: Dimensions from about 100 meters to 500 meters. The middle scale may include locations of expected maximum hourly concentrations due to proximity to major NO2 point, area, and/or non-road sources.
Neighborhood: The neighborhood scale represents air quality conditions throughout some relatively uniform land use areas with dimensions in the 0.5 to 4.0 kilometer range.
Urban: Measurements in this scale would be used to estimate concentrations over large portions of an urban area with dimensions from 4 to 50 kilometers. Such measurements would be useful for assessing trends in area-wide air quality, and hence, the effectiveness of large scale air pollution control strategies.
PM2.5
Micro: Areas such as downtown street canyons and traffic corridors where the general public would be exposed to maximum concentrations from mobile sources. In some circumstances, the microscale is appropriate for particulate sites; community-oriented SLAMS sites measured at the microscale level should, however, be limited to urban sites that are representative of long-term human exposure and of many such microenvironments in the area. In general, microscale particulate matter sites should be located near inhabited buildings or locations where the general public can be expected to be exposed to the concentration measured. Emissions from stationary sources such as primary and secondary smelters, power plants, and other large industrial processes may, under certain plume conditions, likewise result in high ground level concentrations at the microscale. In the latter case, the microscale would represent an area impacted by the plume with dimensions extending up to approximately 100 meters. Data collected at microscale sites provide information for evaluating and developing hot spot control measures.
Middle: People moving through downtown areas, or living near major roadways, encounter particle concentrations that would be adequately characterized by this spatial scale. Thus, measurements of this type would be appropriate for the evaluation of possible short-term exposure public health effects of particulate matter pollution. In many situations, monitoring sites that are representative of microscale or middle-scale impacts are not unique and are representative of many similar situations. This can occur along traffic corridors or other locations in a residential district. In this case, one location is representative of a number of small scale sites and is appropriate for evaluation of long-term or chronic effects. This scale also includes the characteristic concentrations for other areas with dimensions of a few hundred meters such as the parking lot and feeder streets associated with shopping centers, stadia, and office buildings.
Neighborhood: Measurements in this category would represent conditions throughout some reasonably homogeneous urban sub-region with dimensions of a few kilometers and of generally more regular shape than the middle scale. Homogeneity refers to the particulate matter concentrations, as well as the land use and land surface characteristics. Much of the PM2.5 exposure is expected to be associated with this scale of measurement. In some cases, a location carefully chosen to provide neighborhood scale data would represent the immediate neighborhood as well as neighborhoods of the same type in other parts of the city. PM2.5 sites of this kind provide good information about trends and compliance with standards because they often represent conditions in areas where people commonly live and work for periods comparable to those specified in the NAAQS. In general, most PM2.5 monitoring in urban areas should have this scale.
Urban: This class of measurement would be used to characterize the particulate matter concentration over an entire metropolitan or rural area ranging in size from 4 to 50 kilometers. Such measurements would be useful for assessing trends in area-wide air quality, and hence, the effectiveness of large scale air pollution control strategies. Community-oriented PM2.5 sites may have this scale.
Regional: These measurements would characterize conditions over areas with dimensions of as much as hundreds of kilometers. As noted earlier, using representative conditions for an area implies some degree of homogeneity in that area. For this reason, regional scale measurements would be most applicable to sparsely populated areas. Data characteristic of this scale would provide information about larger scale processes of particulate matter emissions, losses, and transport. PM2.5 transport contributes to elevated particulate concentrations and may affect multiple urban and State entities with large populations, such as in the eastern United States. Development of effective pollution control strategies requires an understanding at regional geographical scales of the emission sources and atmospheric processes that are responsible for elevated PM2.5 levels and may also be associated with elevated O3 and regional haze.
Pb
Micro: This scale would typify areas in close proximity to lead point sources. Emissions from point sources such as primary and secondary lead smelters, and primary copper smelters may, under fumigation conditions, likewise result in high ground level concentrations at the microscale. In the latter case, the microscale would represent an area impacted by the plume with dimensions extending up to approximately 100 meters. Pb monitors in areas where the public has access, and particularly where children have access, are desirable because of the higher sensitivity of children to exposures of elevated Pb concentrations.
Middle: This scale generally represents Pb air quality levels in areas up to several city blocks in size with dimensions on the order of approximately 100 meters to 500 meters. The middle scale may, for example, include schools and playgrounds in center city areas which are close to major Pb point sources. Pb monitors in such areas are desirable because of the higher sensitivity of children to exposures of elevated Pb concentrations. Emissions from point sources frequently impact areas at which single sites may be located to measure concentrations representing middle spatial scales.
Neighborhood: The neighborhood scale would characterize air quality conditions throughout some relatively uniform land use areas with dimensions in the 0.5 to 4.0 kilometer range. Sites of this scale would provide monitoring data in areas representing conditions where children live and play. Monitoring in such areas is important since this segment of the population is more susceptible to the effects of Pb. Where a neighborhood site is located away from immediate Pb sources, the site may be very useful in representing typical air quality values for a larger residential area, and therefore suitable for population exposure and trends analyses.
PAMS
The PAMS program provides more comprehensive data on O3 air pollution in areas classified as serious, severe, or extreme nonattainment for O3 than would otherwise be achieved through the NCore and SLAMS sites. More specifically, the PAMS program includes measurements for O3, oxides of nitrogen, VOC, and meteorology. PAMS design criteria are site specific. Concurrent measurements of O3, oxides of nitrogen, speciated VOC, CO, and meteorology are obtained at PAMS sites. Design criteria for the PAMS network are based on locations relative to O3 precursor source areas and predominant wind directions associated with high O3 events. Specific monitoring objectives are associated with each location. The overall design should enable characterization of precursor emission sources within the area, transport of O3 and its precursors, and the photochemical processes related to O3 nonattainment. Specific objectives that must be addressed include assessing ambient trends in O3, oxides of nitrogen, and VOC species, and determining spatial and diurnal variability of O3, oxides of nitrogen, and VOC species. Specific monitoring objectives associated with each of these sites may result in four distinct site types. Detailed guidance for the locating of these sites may be found in reference 9 of this appendix.
(a) Type 1 sites are established to characterize upwind background and transported O3 and its precursor concentrations entering the area and will identify those areas which are subjected to transport.
(b) Type 2 sites are established to monitor the magnitude and type of precursor emissions in the area where maximum precursor emissions are expected to impact and are suited for the monitoring of urban air toxic pollutants.
(c) Type 3 sites are intended to monitor maximum O3 concentrations occurring downwind from the area of maximum precursor emissions.
(d) Type 4 sites are established to characterize the downwind transported O3 and its precursor concentrations exiting the area and will identify those areas which are potentially contributing to overwhelming transport in other areas.
Minimum Monitoring Network Requirements. A Type 2 site is required for each area. Overall, only two sites are required for each area, provided all chemical measurements are made. For example, if a design includes two Type 2 sites, then a third site will be necessary to capture the NOy measurement. The minimum required number and type of monitoring sites and sampling requirements are listed in Table D-6 of this appendix. Any alternative plans may be put in place in lieu of these requirements, if approved by the Administrator.
PM10-2.5
The only required monitors for PM10-2.5 are those required at NCore Stations. Although microscale monitoring may be appropriate in some circumstances, middle and neighborhood scale measurements are the most important station classifications for PM10-2.5.
Micro: This scale would typify relatively small areas immediately adjacent to: industrial sources; locations experiencing ongoing construction, redevelopment, and soil disturbance; and heavily traveled roadways. Data collected at microscale stations would characterize exposure over areas of limited spatial extent and population exposure, and may provide information useful for evaluating and developing source-oriented control measures.
Middle: People living or working near major roadways or industrial districts encounter particle concentrations that would be adequately characterized by this spatial scale. Thus, measurements of this type would be appropriate for the evaluation of public health effects of coarse particle exposure. Monitors located in populated areas that are nearly adjacent to large industrial point sources of coarse particles provide suitable locations for assessing maximum population exposure levels and identifying areas of potentially poor air quality. Similarly, monitors located in populated areas that border dense networks of heavily-traveled traffic are appropriate for assessing the impacts of resuspended road dust. This scale also includes the characteristic concentrations for other areas with dimensions of a few hundred meters such as school grounds and parks that are nearly adjacent to major roadways and industrial point sources, locations exhibiting mixed residential and commercial development, and downtown areas featuring office buildings, shopping centers, and stadiums.
Neighborhood: Measurements in this category would represent conditions throughout some reasonably homogeneous urban sub-region with dimensions of a few kilometers and of generally more regular shape than the middle scale. Homogeneity refers to the particulate matter concentrations, as well as the land use and land surface characteristics. This category includes suburban neighborhoods dominated by residences that are somewhat distant from major roadways and industrial districts but still impacted by urban sources, and areas of diverse land use where residences are interspersed with commercial and industrial neighborhoods. In some cases, a location carefully chosen to provide neighborhood scale data would represent the immediate neighborhood as well as neighborhoods of the same type in other parts of the city. The comparison of data from middle scale and neighborhood scale sites would provide valuable information for determining the variation of PM10-2.5 levels across urban areas and assessing the spatial extent of elevated concentrations caused by major industrial point sources and heavily traveled roadways. Neighborhood scale sites would provide concentration data that are relevant to informing a large segment of the population of their exposure levels on a given day.

PM2.5 Speciation
NA: Each State shall continue to conduct chemical speciation monitoring and analyses at sites designated to be part of the PM2.5 Chemical Speciation Trends Network (CSN). The selection and modification of these CSN sites must be approved by the Administrator.
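Because the same five spatial scales recur throughout the pollutant entries above with consistent nominal dimension ranges, those ranges can be collected into a small lookup for quick reference. The sketch below is purely illustrative (the dictionary and function names are our own, and the ranges are only the nominal ones quoted above, not a siting rule; 40 CFR Part 58, Appendix D remains the authority):

```python
# Nominal dimension ranges (meters) for the monitoring spatial scales above.
# Illustrative only; consult 40 CFR Part 58, Appendix D for actual criteria.
SPATIAL_SCALES_M = {
    "micro":        (0, 100),         # plume-impacted areas up to ~100 m
    "middle":       (100, 500),       # up to several city blocks
    "neighborhood": (500, 4_000),     # 0.5-4.0 km of uniform land use
    "urban":        (4_000, 50_000),  # large portions of an urban area
    "regional":     (50_000, None),   # up to hundreds of kilometers
}

def nominal_scale(dimension_m: float) -> str:
    """Return the spatial scale whose nominal range covers a dimension."""
    for name, (low, high) in SPATIAL_SCALES_M.items():
        if dimension_m >= low and (high is None or dimension_m < high):
            return name
    return "regional"

print(nominal_scale(2_500))   # -> "neighborhood"
```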
Appendix F
Sample Manifold Design for Precursor Gas Monitoring
The following information is extracted from Version 4 of the "Technical Assistance Document for Precursor Gas Measurements in the NCore Multi-pollutant Monitoring Network," which can be found on the AMTIC website at: http://www.epa.gov/ttn/amtic/pretecdoc.html
Sample Manifold Design for Precursor Gas Monitoring
Many important variables affect sampling manifold design for ambient precursor gas monitoring:
residence time of sample gases, materials of construction, diameter, length, flow rate, and
pressure drop. Considerations for these parameters are discussed below.
Residence Time Determination: The residence time of air pollutants within the sampling system (defined as extending from the entrance of the sample inlet above the instrument shelter to the bulkhead of the precursor gas analyzer) is critical. Residence time is defined as the amount of time that it takes for a sample of air to travel through the sampling system. This issue is discussed in detail for NOy monitoring in Section 4.2, and the recommendations in Section 4 for the arrangement of the molybdenum converter and inlet system should be followed. However, residence time is also an issue for other precursor gases and should be considered in designing sample manifolds for those species. For example, the Code of Federal Regulations (CFR), Title 40 Part 58, Appendix E.9 states, "Ozone in the presence of NO will show significant losses even in the most inert probe material when the residence time exceeds 20 seconds." Other studies indicate that a residence time of 10 seconds or less is easily achievable (see Reference 1). Although a 20-second residence time is the maximum allowed as specified in 40 CFR 58, Appendix E, it is recommended that the residence time within the sampling system be less than 10 seconds. If the volume of the sampling system does not allow this to occur, then a blower motor or other device (such as a vacuum pump) can be used to increase flow rate and decrease the residence time. The residence time for a sample manifold system is determined in the following way. First, the total volume of the cane (inlet), manifold, and sample lines must be determined using the following equation:
Total Volume = Cv + Mv + Lv          Equation 1

Where:
Cv = volume of the sample cane or inlet and extensions
Mv = volume of the sample manifold and moisture trap
Lv = volume of the instrument lines from the manifold to the instrument bulkhead

The volume of each component of the sampling system must be measured individually. To measure the volume of the components (assuming they are cylindrical in shape), use the following equation:

V = π * (d/2)² * L          Equation 2

Where:
V = volume of the component, cm³
π = 3.14
L = length of the component, cm
d = inside diameter of the component, cm
Once the total volume is determined, divide the total volume by the total sample flow rate of all
instruments to calculate the residence time in the inlet. If the residence time is greater than 20
seconds, attach a blower or vacuum pump to increase the flow rate and decrease the residence
time.
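Because this check is simple arithmetic, it can be scripted. The following is a minimal sketch (the function names and the example dimensions are our own, hypothetical values) that applies Equations 1 and 2 and compares the result against the 10-second recommendation and the 20-second maximum:

```python
import math

def cylinder_volume_cm3(inside_diameter_cm: float, length_cm: float) -> float:
    """Equation 2: V = pi * (d/2)^2 * L, in cubic centimeters."""
    return math.pi * (inside_diameter_cm / 2.0) ** 2 * length_cm

def residence_time_s(component_volumes_cm3, total_flow_L_min: float) -> float:
    """Equation 1 (sum the component volumes), then divide by total draw."""
    total_volume_cm3 = sum(component_volumes_cm3)
    flow_cm3_s = total_flow_L_min * 1000.0 / 60.0   # L/min -> cm^3/s
    return total_volume_cm3 / flow_cm3_s

# Hypothetical station, for illustration only:
cane = cylinder_volume_cm3(1.3, 300)        # Cv: 3 m cane, 13 mm ID
manifold = cylinder_volume_cm3(6.0, 30)     # Mv: short manifold + moisture trap
lines = 4 * cylinder_volume_cm3(0.4, 200)   # Lv: four 2 m instrument lines

t = residence_time_s([cane, manifold, lines], total_flow_L_min=5.0)
print(f"Residence time: {t:.1f} s")
# ~16 s here: under the 20 s limit but above the 10 s recommendation,
# so a blower or vacuum pump on the exhaust would be advisable.
```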
Laminar Flow Manifolds: In the past, vertical laminar flow manifolds were a popular design.
By the proper selection of a large diameter vertical inlet probe and by maintaining a laminar flow
throughout, it was assumed that the sample air would not react with the walls of the probe.
Numerous materials such as glass, plastic, galvanized steel, and stainless steel were used for
constructing the probe. Removable sample lines constructed of FEP or PTFE were placed to
protrude into the manifold to provide each instrument with sample air. A laminar flow manifold
could have a flow rate as high as 150 L/min, in order to minimize any losses, and large diameter
tubing was used to minimize pressure drops. However, experience has shown that vertical laminar flow manifolds have many disadvantages, which are listed below:
• Since the flow rates are so high, it is difficult to supply enough audit gas to provide an adequate independent assessment for the entire sampling system;
• Long laminar flow manifolds may be difficult to clean due to size and length;
• Temperature differentials may exist that could change the characteristics of the gases; e.g., if a laminar manifold's inlet is on top of a building, the temperature at the bottom of the building may be much lower, cooling the sample below its dew point and condensing water;
• Construction of the manifold is frequently of an unapproved material.
For these technical reasons, EPA strongly discourages the use of laminar flow manifolds in the
national air monitoring network. It is recommended that agencies that utilize laminar manifolds
migrate to conventional manifold designs that are described below.
Sampling Lines as Inlet and Manifold: Often air monitoring agencies will place individual
sample lines outside of their shelter for each instrument. If the sample lines are manufactured
out of polytetrafluoroethylene (PTFE), perfluoroalkoxy (PFA), or fluorinated ethylene propylene (FEP) Teflon, this is acceptable to the EPA. The advantages of using single sample lines are no breakage and ease of external auditing. In addition, rather than cleaning glass manifolds, some
agencies just replace the sampling lines. However, please note the following caveats:
1. Lines can deteriorate when exposed to atmospheric conditions, particularly ultraviolet radiation from the sun. Therefore, it is recommended that sample lines be inspected and replaced regularly.
2. Small insects and particles can accumulate inside of the tubing. It has been reported that
small spiders build their webs inside of tubing. This can cause blockage and affect the
response of the instruments. In addition, particles can collect inside the tubing, especially at
the entrance, thus affecting precursor gas concentrations. Check the sampling lines and
replace or clean the tubing on a regular basis.
3. Since there is no central manifold, these configurations sometimes have a three-way tee,
i.e., one flow path for supplying calibration mixtures and the other for the sampling of
ambient air. If the three-way tee is not placed near the outermost limit of the sample inlet
tubing, then the entire sampling system is not challenged by the provision of calibration gas.
It is strongly recommended that, at least on a periodic basis, calibration gas be supplied so
that it floods the entire sample line. This is best done by placing the three-way tee just
below the sample inlet, so that calibration gas supplied there is drawn through the entire
sampling line.
4. The calibration gas must be delivered to the analyzers at near ambient pressure. Some
instruments are very sensitive to pressure changes. If the calibration gas flow is excessive,
the analyzer may sample the gas under pressure. If a pressure effect on calibration gas
response is suspected, it is recommended that the gas be introduced at more than one place
in the sampling line (by placement of the tee, as described in item #3 above). If the response
to the calibration gas is the same regardless of delivery point, then there is likely no pressure
effect.
Conventional Manifold Design - A number of conventional manifold systems exist today.
However, one manifold feature must be consistent: the probe and manifold must be constructed
of borosilicate glass or Teflon (PFA or PTFE). These are the only materials proven to be inert
to gases. EPA will accept manifolds or inlets that are made from other materials, such as steel or
aluminum, that are lined or coated with borosilicate glass or the Teflon materials named above.
However, all of the linings, joints and connectors that could possibly come into contact with the
sample gases must be of glass or Teflon. It is recommended that probes and manifolds be
constructed in modular sections to enable frequent cleaning. It has been demonstrated that there
are no significant losses of reactive gas concentrations in conventional 13 mm inside diameter
(ID) sampling lines of glass or Teflon if the sample residence time is 10 seconds or less. This is
true even in sample lines up to 38 m in length. However, when the sample residence time
exceeds 20 seconds, loss is detectable, and at 60 seconds the loss can be nearly complete.
Therefore, EPA requires that residence times must be 20 seconds or less (except for NOy).
Please note that for particulate matter (PM) monitoring instruments, such as nephelometers,
Tapered Element Oscillating Microbalance (TEOM) instruments, or Beta Gauges, the ambient
precursor gas manifold is not recommended. Particle monitoring instruments should have
separate intake probes that are as short and as straight as possible to avoid particulate losses due
to impaction on the walls of the probe.
T-Type Design: The most popular gas sampling system in use today consists of a vertical
"candy cane" protruding through the roof of the shelter with a horizontal sampling manifold
connected by a tee fitting to the vertical section (Figure 1). This type of manifold is
commercially available. At the bottom of the tee is a bottle for collecting particles and moisture
that cannot make the bend; this is known as the "drop-out" or moisture trap bottle. Please note
that a small blower at the exhaust end of the system (optional) is used to provide flow through
the sampling system. There are several issues that must be mitigated with this design:
• The probe and manifold may have a volume such that the total draw of the precursor gas analyzers cannot keep the residence time less than 20 seconds (except NOy), thereby requiring a blower motor. However, a blower motor may prevent calibration and audit gases from being supplied in sufficient quantity, because of the high flow rate in the manifold. To remedy this, the blower motor must be turned off for calibration.
However, this may affect the response of the instruments since they are usually operated
with the blower on.
• Horizontal manifolds have been known to collect water, especially in humid climates. Standing water in the manifold can be pulled into the instrument lines. Since most monitoring shelters are maintained at 20-30 °C, condensation can occur when warm humid outside air enters the manifold and is cooled. Station operators must be aware of this issue and mitigate this situation if it occurs. Tilting the horizontal manifold slightly and possibly heating the manifold have been used to mitigate the condensation problem. Water traps should be emptied whenever there is standing water.
Figure 1. Conventional T-Type Glass Manifold System
[Schematic: sample cane protruding through the roof line, "T" adaptor, modular manifold with screw-type openings and Teflon connectors/bushings, moisture trap at the bottom of the tee, and blower motor at the exhaust end.]
California Air Resources Board Octopus Style: Another type of manifold that is widely used is known as the California Air Resources Board (CARB) style or "Octopus" manifold, illustrated in Figure 2. This manifold has a reduced profile, i.e., there is less volume in
the cane and manifold; therefore, there is less need for a blower motor. If the combined flow
rates of the gas analyzers are high enough, then an additional blower is not required.
Figure 2. CARB or Octopus Style Manifold
[Schematic: sample cane at the roof line with a screw-type opening, Teflon connectors/bushings, 8-port "Octopus" manifold, and moisture trap.]
Placement of Tubing on the Manifold: If the manifold employed at the station has multiple
ports (as in Figure 2) then the position of the instrument lines relative to the calibration input line
can be crucial. If a CARB Octopus or similar manifold is used, it is suggested that sample
connections for analyzers requiring lower flows be placed towards the bottom of the manifold.
Also, the general rule of thumb states that the calibration gas delivery line (if used) should be in
a location so that the calibration gas flows past the analyzer inlet points before the gas is
evacuated out of the manifold. Figure 3 illustrates two potential locations for introduction of the
calibration gas. One is located at the ports on the Octopus manifold, and the other is upstream
near the air inlet point, using an audit or probe inlet stub. This stub is a tee fitting placed so that a Through-the-Probe audit line can be connected and sampling system tests and calibrations can be conducted.
Figure 3. Placement of Lines on the Manifold
[Schematic: the sample cane at the roof line carries an audit and probe inlet stub; instrument inlet lines and a calibration outlet line attach to the manifold ports below.]
Figure 4. Specifications for an Octopus Style Manifold

Measurements and Features:
1. Knurled connector
2. O-ring
3. Threaded opening
4. Top extension - 56 mm
5. Overall length - 304 mm
6. Outside diameter - 24 mm
7. Top and bottom shoulder - 50 mm
8. Length of inlet tube - 30 mm
9. Distance between inlet tubes - 16 mm
10. Length of internal tube - 145 mm
11. Width of inlet tube OD - 6 mm
12. Distance from inner tube to wall - 18 mm
13. Inside width of outer tube - 60 mm
14. Down tube length - 76 mm
15. Width of down tube OD - 24 mm
16. Overall width - ~124 mm
Figure 4 illustrates the specifications of an Octopus style manifold. Please note that EPA-OAQPS has used this style of manifold in its precursor gas analyzer testing program. This type of manifold is commercially available.
Vertical Manifold Design: Figure 5 shows a schematic of the vertical manifold design.
Commercially available vertical manifolds have been on the market for some time. The issues
with this type of manifold are the same as with other conventional manifolds, i.e., when sample air
moves from a warm humid atmosphere into an air-conditioned shelter, condensation of moisture
can occur on the walls of the manifold. Commercially available vertical manifolds have the
option for heated insulation to mitigate this problem. Whether the manifold tubing is made of
glass or Teflon, the heated insulation prevents viewing of the tubing, so the interior must be
inspected often. The same issues apply to this manifold style as with horizontal or Octopus
style manifolds: additional blower motors should not be used if the residence time is less than 20
seconds, and the calibration gas inlet should be placed upstream so that the calibration gas flows
past the analyzer inlets before it exits the manifold.
Figure 5. Example of Vertical Design Manifold
[Schematic: support pipe at the roof line, glass manifold with sample ports surrounded by heated insulation (with heater power cord), manifold support, "T" connector, blower motor, and exhaust hose running to the floor.]
Manifold/Instrument Line Interface: A sampling system is an integral part of a monitoring station; however, it is only one part of the whole monitoring process. With the continuing integration of advanced electronics into monitoring stations, manifold design must be taken into
consideration. Data Acquisition Systems (DASs) are able not only to collect serial and analog
data from the analyzers, but also to control Mass Flow Calibration (MFC) equipment and solid
state solenoid switches, communicate via modem or Ethernet, and monitor conditions such as
shelter temperature and manifold pressure. As described in Chapter 6, commercially available
DASs may implement these features in an electronic data logger, or via software installed on a
personal computer. Utilization of these features allows the DAS and support equipment to
perform automated calibrations (Autocals). In addition to performing these tasks, the DAS can
flag data during calibration periods and allow the data to be stored in separate files that can be
reviewed remotely.
Figure 6 shows a schematic of the integrated monitoring system at EPA's Burdens Creek NCore monitoring station. Note that a series of solenoid switches is positioned between the ambient air inlet manifold and an additional calibration manifold. This configuration allows the DAS to control the route from which the analyzers draw their sample. At the beginning of an Autocal, the DAS signals the MFC unit to come out of standby mode and start producing zero or calibration gas. Once the MFC has stabilized, the DAS switches the analyzers' inlet flow (via solenoids) from the ambient air manifold to the calibration manifold. The calibration gas is routed to the instruments, and the DAS monitors and averages the response, flagging the data appropriately as calibration data. When the Autocal has terminated, the DAS switches the analyzers' inlet flow from the calibration manifold back to the ambient manifold, and the data system resets the data flag to the normal ambient mode.
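The Autocal sequence just described amounts to a small control loop driven by the DAS. The sketch below is a hypothetical illustration of that order of operations; `das`, `mfc`, and `solenoids` are stand-ins for vendor-specific interfaces (EnviDAS, Environics, and other real products have their own APIs, and none of the method names here are real ones):

```python
import time

def run_autocal(das, mfc, solenoids, level_ppb, stabilize_s=300, sample_s=600):
    """Hypothetical Autocal sequence mirroring the steps described above.

    das.read_analyzers() is assumed to return a dict mapping analyzer
    names to instantaneous readings; all interfaces here are invented.
    """
    mfc.start(level_ppb)             # DAS signals the MFC to leave standby
    time.sleep(stabilize_s)          # wait for the calibration gas to stabilize
    solenoids.energize()             # analyzers now draw from the cal manifold
    das.set_flag("CALIBRATION")      # flag data so it is stored separately
    readings = []
    for _ in range(sample_s // 60):  # one reading per minute
        readings.append(das.read_analyzers())
        time.sleep(60)
    avg = {name: sum(r[name] for r in readings) / len(readings)
           for name in readings[0]}  # average response per analyzer
    solenoids.deenergize()           # back to the ambient manifold
    das.set_flag("AMBIENT")          # reset the data flag to normal mode
    mfc.standby()
    return avg
```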
The integration of DAS, solenoid switches, and MFC into an automated configuration can bring
an additional level of complexity to the monitoring station. Operators must be aware that this
additional complexity can create situations where leaks can occur. For instance, if a solenoid
switch fails to open, the inlet flow of an analyzer may not be switched back to the ambient manifold, and the analyzer will instead sample interior room air. When the calibrations occur, the instrument will span correctly, but will not return to ambient air sampling. In this case, the data collected must be invalidated. These problems are usually not discovered until there is an external Through-the-Probe audit, but by then extensive data could be lost. It is recommended
that the operator remove the calibration line from the calibration manifold on a routine basis and
challenge the sampling system from the inlet probe. This test will discover any leak or switching
problems within the entire sampling system.
Figure 6. Example of a Manifold/Instrument Interface
[Schematic of the Burdens Creek Sampling Station (OAQPS/MQAG): an Environics 9100 calibration system with a triblend calibration standard and zero air source; TECO 42CY TL (NOx), TECO 43CTL (SO2), and TECO 48CTL (CO) analyzers plus another monitor (e.g., O3); an external molybdenum converter on the sampling cane; an ambient sampling manifold with manifold fan and charcoal scrubber; a calibration manifold connected through Teflon 3-way solenoids (NO/NC/COM ports); temperature/pressure/RH sensors; a UPS power supply; and an EnviDAS data acquisition system with analog inputs from the analyzers, control outputs to the calibration system and solenoids, and a modem link to a desktop system. Sample tubing lengths are < 3 ft.]
Figure 7 shows a close-up of an ambient/calibration manifold, illustrating the calibration manifold/ambient manifold interface. This is the same interface used at EPA's Burdens Creek monitoring station. The interface consists of three distinct portions: the ambient manifold, the solenoid switching system, and the calibration manifold. In this instance, the ambient manifold is a T-type design that is being utilized with a blower fan at the terminal. Teflon tubing connects the manifold to the solenoid switching system. The solenoids have two states. In the passive (de-energized) state, the ports that are connected are the normally open (NO) port and the common (COM) port. In the energized state, the ports that are connected are the normally closed (NC) port and the COM port. Depending on whether the solenoid is active or not, the solenoid routes the air from the calibration or ambient manifold to the instrument inlets. There are two configurations that can be instituted with this system.
1. Ambient Mode: In this mode the solenoids are in the passive state. The flow of air (under vacuum) is routed from the NO port through the solenoid to the COM port.
2. Calibration Mode: In this mode, the solenoids are in the active state. An external switching device, usually the DAS, must supply direct current to the solenoid. This causes the solenoid to be energized so that the NO port is shut and the NC port is now connected to the COM port. As in all cases, the COM port is always selected. The switching of the solenoid is done in conjunction with the MFC unit becoming active;
generally, the MFC is controlled by the DAS. When the calibration sequences have finished, the DAS stops the direct current from being sent to the solenoid, which switches automatically back to the NO-to-COM (inactive) port configuration. This allows the air to flow through the NO port to the COM port, and the instrument is now back in ambient mode.
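Since the NO/NC/COM routing described in the two modes above is easy to reverse in one's head, it can be reduced to a two-line truth table. This is a generic sketch of the port logic, not any vendor's control code:

```python
def connected_inlet(energized: bool) -> str:
    """Return which inlet port a three-way solenoid connects to COM.

    Passive (de-energized): NO -> COM, so ambient air reaches the analyzer.
    Energized:              NC -> COM, so calibration gas reaches the analyzer.
    """
    return "NC (calibration manifold)" if energized else "NO (ambient manifold)"

for energized in (False, True):
    print(f"energized={energized}: {connected_inlet(energized)} -> COM -> analyzer")
```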
Figure 7. Ambient Calibration Manifold Interface
[Schematic: air flows to the analyzers through the solenoid COM ports. Ambient air enters through the NO ports; calibration gas from the mass flow calibrator enters through the NC ports, with excess calibration gas routed to exhaust; ambient manifold flow exits through the exhaust fan.]
Reference
1. Code of Federal Regulations, Title 40, Part 58, Appendix E.9
Appendix G
Example Procedure for Calibrating a Data Acquisition System
DAS Calibration Technique
The following is an example of a DAS calibration; the DAS owner's manual should be followed. The calibration of a DAS is performed by inputting known voltages into the DAS and measuring the output of the DAS.
1. The calibration begins by obtaining a voltage source and an ohm/voltmeter.
2. Place a wire lead across the input of the DAS multiplexer. With the input "shorted" out, the DAS should read zero.
3. If the output does not read zero, adjust the output according to the owner's manual.
4. After the background zero has been determined, it is time to adjust the full scale of the system. Most DAS systems work on a 1, 5, or 10 volt range, i.e., full scale corresponds to the maximum output voltage. In the case of a 0 - 1000 ppb range instrument, 1.00 volts equals 1000 ppb. Accordingly, 500 ppb equals 0.5 volts (500 millivolts). To confirm that the DAS is linear throughout the range of the instrument being measured, the DAS must be tested for linearity.
5. Attach the voltage source to a voltmeter. Adjust the voltage source to 1.000 volts (it is critical that the output be exactly 1.000 volts). Attach the output of the voltage source to the DAS multiplexer. The DAS should read 1000 ppb. Adjust the DAS voltage A/D card accordingly. Adjust the output of the voltage source to 0.250 volts. The DAS output should read 250 ppb. Adjust the A/D card in the DAS accordingly. Once the lower range of the DAS has been adjusted, check the full scale point. With the voltage source at 1.000 volts, the output should be 1000 ppb. If it isn't, adjust the DAS to bring the high and low points as close to the source voltage as possible. In some cases, the linearity of the DAS may be in question. If this occurs, the data collected may need to be adjusted using a linear regression equation (a sketch of such a check follows this list). See Section 2.0.9 for details on data adjustment. The critical range for many instruments is in the lower 10% of the scale. It is critical that this range be linear.
6. Every channel on a DAS should be calibrated. In some newer DAS systems, there is only one A/D card voltage adjustment, which is carried throughout the multiplexer and usually adjusts all channels. It is recommended that the DAS be calibrated once per year.
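The scaling and linearity steps above lend themselves to a small worked example. In the sketch below (the test voltages and readings are invented for illustration), DAS readings for a 0-1000 ppb instrument on a 1 V full-scale range are compared against the applied voltages, and the least-squares slope and intercept mentioned in step 5 are computed:

```python
# Linearity check for a 0-1000 ppb instrument on a 1 V full-scale DAS range.
# The voltage/reading pairs below are invented for illustration.
FULL_SCALE_V, FULL_SCALE_PPB = 1.000, 1000.0

def volts_to_ppb(volts: float) -> float:
    return volts * FULL_SCALE_PPB / FULL_SCALE_V   # 0.250 V -> 250 ppb

# (applied source voltage, DAS-reported ppb) from a multipoint check
points = [(0.000, 1.0), (0.100, 99.0), (0.250, 249.5), (1.000, 1001.0)]
expected = [volts_to_ppb(v) for v, _ in points]
reported = [r for _, r in points]

# Least-squares fit: the "linear regression equation" that step 5 says may be
# needed to adjust data when the DAS is not perfectly linear.
n = len(points)
mean_x = sum(expected) / n
mean_y = sum(reported) / n
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(expected, reported))
sxx = sum((x - mean_x) ** 2 for x in expected)
slope, intercept = sxy / sxx, mean_y - (sxy / sxx) * mean_x
print(f"slope = {slope:.4f}, intercept = {intercept:.2f} ppb")
# A slope near 1 and an intercept near 0 indicate a linear DAS; pay special
# attention to the lower 10% of scale, where many instruments operate.
```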
Appendix H
United States Environmental Protection Agency
National Ambient Air Monitoring Technical Systems Audit Checklist
The attached version is very similar to the checklist in the 2008 QA Handbook. It is an example that has been modified for use in EPA Region 5.
Table of Contents
1) General / Quality Management
a) Program Organization
b) Facilities
c) Independent Quality Assurance and Quality Control
d) Planning Documents (including QMP, QAPPs, & SOPs)
e) General Documentation Policies
f) Training
g) Oversight of Contractors and Suppliers
h) Corrective Action
i) Quality Improvement
j) External Performance Audits
2) Network Management / Field Operations
a) Network Design
b) Changes to the Network since the last audit
c) Proposed changes to the Network
d) Field Support
i) SOPs
ii) Instrument Acceptance
iii) Calibration
iv) Repair
v) Record Keeping
vi) Site and Monitor Information Form
3) Laboratory Operations
a) Routine Operations
b) Quality Control
c) Laboratory Preventive Maintenance
d) Laboratory Record Keeping
e) Laboratory Data Acquisition and Handling
f) Specific Pollutants: PM10, PM2.5, and Lead
4) Data and Data Management
a) Data Handling
b) Software Documentation
c) Data Validation and Correction
d) Data Processing
e) Internal Reporting
f) External Reporting
QA Handbook Volume II, Appendix H
Revision No: 1.0
Date: 05/13
Page 4 of 48
1) General / Quality Management
State/Local/Tribal Agency Audited:
Address:
City, State, and Zip Code:
Date of Technical System Audit:
Auditor / Agency:

a) Program Organization
Key Individuals
Agency Director:
Ambient Air Monitoring (AAM) Network Manager:
Quality Assurance Manager:
QA Auditors:
Field Operations Supervisor / Lead:
Laboratory Supervisor:
QA Laboratory Manager:
Data Management Supervisor / Lead:
Attach an Organizational Chart:
Comment on the need for additional personnel if applicable.
List personnel who have authority or are responsible for:
Activity Name Title
QA Training Field/Lab
Grant Management
Purchases greater than $500
Equipment and Service Contract Management
Staff appointment
Flow Chart:
Key position staffing. For each of the following program areas, record the number of primary personnel, the number of backup personnel, and any vacancies:
• Network Design and Siting
• QC activities
• QA activities
• Data and Data Management
• Equipment repair and maintenance
• Financial Management
List available personnel by name and the percentage of time each spends on the following task categories:
Name | Network Design and Siting | QC Activities | QA Activities | Equipment repair and maintenance | Data and Data Management | Financial Management
b) Facilities
Identify the principal facilities where the agency conducts work that is related to air monitoring. Do not include monitoring stations, but do include facilities where work is performed by contractors or other organizations. For each facility, record the AAM function, the office responsible for ensuring adequacy, the location, and whether it is adequate (Y/N, to be completed by the auditor). Functions to address include:
• Instrument repair
• Certification of standards (e.g., gases, flow transfer standards, MFCs)
• PM filter weighing
• Data verification and processing
• General office space
• Storage space, short and long term
• Air Toxics (Carbonyls, VOCs, Metals)
Indicate any facilities that should be upgraded. Identify by function:
Are facilities adequate concerning safety? Yes No
Please explain if answer is no:
Suggested improvements or recommendations for the items above:
Are there any significant changes which are likely to be implemented to agency facilities within the next one to two years?
Comment on the agency's needs for additional physical space (laboratory, office, storage, etc.).
Facility Function Proposed Change - Date
c) Independent Quality Assurance and Quality Control
Status of Quality Assurance Program
Question Yes No Comment
Does the agency perform QA activities with internal personnel? If no, go to Section d.
Does the agency maintain a separate laboratory to
support quality assurance activities?
Has the agency documented and implemented specific
audit procedures separate from monitoring
procedures?
Are there two levels of management separation
between QA and QC operations? Please describe
below:
Does the agency have identifiable auditing equipment
and standards (specifically intended for sole use) for
audits?
Internal Performance Audits
Question Yes No Comment
Does the agency have separate facilities to support
audits and calibrations?
If the agency has in place contracts or similar agreements either with another agency or contractor to perform audits or calibrations,
please name the organization and briefly describe the type of agreement.
If the agency does not have a performance audit SOP (included as an attachment), please describe performance audit procedure for
each type of pollutant.
Does the agency maintain independence of audit
standards and personnel?
Please provide information on certification of audit standards currently being used. Include information on vendor and internal or
external certification of standards.
Does the agency have a certified source of zero air for
performance audits?
Does the agency have procedures for auditing and/or
validating performance of Meteorological monitoring?
Please provide a list of the agency's audit equipment and the age of the audit equipment.
Is audit equipment ever used to support routine calibration and QC checks required for monitoring network operations? If yes,
please describe.
Are standard operating procedures (SOPs) for air
monitoring available to all field personnel?
Has the agency established and has it documented
criteria to define agency-acceptable audit results?
Please complete the table below with the pollutant, monitor and acceptance criteria.
Pollutant | How is performance tracked (e.g., control charts) | Audit Result Acceptance Criteria
CO
O3
NO2
SO2
PM10
PM2.5
Pb
VOCs
Carbonyls
PM2.5 speciation
PM10-2.5 speciation
PM10-2.5 FRM Mass
Continuous PM2.5
Trace Levels (CO)
Trace Levels (SO2)
Trace Levels (NO)
Trace Levels (NOy)
Surface Meteorology
Others
Question Yes No Comment
Were these audit criteria based on, or derived from, the guidance found in Volume II of the QA Handbook for Air Pollution Measurement Systems, Section 10.3?
If no, please explain.
If yes, please explain any changes or
assumptions made in the derivation.
What corrective action may be taken if criteria are exceeded? If possible, indicate two examples of corrective actions, taken within the period since the previous systems audit, which are based directly on the criteria discussed above.
Corrective Action #1
Corrective Action #2
d) Planning Documents (including QMP, QAPP, & SOPs)
QMP questions
Yes No Comment
Does the agency have an EPA-approved quality
management plan?
If yes, have changes to the plan been approved by
the EPA?
Has the QMP been approved by EPA within the
last five years?
Please provide:
Date of Original Approval: Date of Last Revision: Date of Latest Approval:
QAPP questions
Yes No Comment
Does the agency have an EPA-approved quality
assurance project plan?
If yes, have changes to the plan been approved by
the EPA?
Has the QAPP been reviewed by EPA annually?
Please provide:
Date of Original Approval: Date of Last Revision: Date of Latest Approval:
Does the agency have any revisions to its QA project plan still pending?
How does the agency verify the QA project plan is fully implemented?
How are the updates distributed?
What personnel regularly receive updates?
SOP questions
Has the agency prepared and implemented standard
operating procedures (SOPs) for all facets of
agency operation?
Do the SOPs adequately address the ANSI/ASQC E-4 quality system required by 40 CFR 58, Appendix A?
Are copies of the SOP or pertinent sections
available to agency personnel?
How does the agency verify that the SOPs are
implemented as provided?
How are the updates distributed?
e) General Documentation Policies
Question Yes No Comment
Does the agency have a documented records management plan?
Does the agency have a list of files considered official records and their media type, i.e., paper or electronic?
Does the agency have a schedule for retention and disposition of records?
Are records retained for at least three years?
Who is responsible for the storage and retrieval of records?
What security measures are utilized to protect records?
Where/when does the agency rely on electronic files as primary
records?
What is the system for the storage, retrieval and backup of these
files?
f) Training
Indicate below the three most recent training events and identify the personnel participating in them.
Event | Dates | Participant(s)
Question Yes No Comment
Does the agency have a training program and training
plan?
Where is it documented?
Does it make use of seminars, courses, EPA
sponsored college level courses?
Are personnel cross-trained for other ambient air
monitoring duties?
Are training funds specifically designated in the
annual budget?
Does the training plan include:
Training requirements by position
Frequency of training
Training for contract personnel
A list of core QA related courses
g) Oversight of Contractors and Suppliers
Questions about Contractors Yes No Comment
Who is responsible for oversight of contract personnel?
What steps are taken to ensure contract personnel meet training
and experience criteria?
How often are contracts reviewed and/or renewed?
Questions about Suppliers
Have criteria and specifications been established for consumable supplies and for equipment?
What supplies and equipment have established specifications?
Is equipment from suppliers open for bid?
h) Corrective Action
Question Yes No Comment
Does the agency have a comprehensive corrective action program in place and
operational?
Have the procedures been documented?
As a part of the QA project plan?
As a separate standard operating procedure?
Does the agency have established and documented corrective limits for QA and
QC activities?
Are procedures implemented for corrective actions based on results of the
following which fall outside the established limits:
Performance evaluations?
Precision goals?
Bias goals?
NPAP audits?
PEP audits?
Validation of one-point QC check goals?
Completeness goals?
Data audits?
Calibrations and zero span checks?
Technical Systems Audit findings?
Have the procedures been documented?
How is responsibility for implementing corrective actions assigned? Briefly discuss.
How does the agency follow up on implemented corrective actions?
Briefly describe recent examples of the ways in which the above corrective action system was employed to remove problems.
i) Quality Improvement
Question Yes No Comment
What actions were taken to improve the quality system since the last TSA?
Since the last TSA, do your control charts indicate that the overall data quality for each pollutant is steady or improving?
For areas where data quality appears to be declining, has a cause been determined?
Have all deficiencies indicated on the previous TSA been corrected?
If not, explain.
Are there pending plans for quality improvement such as purchase of new
or improved equipment, standards, or instruments?
j) External Performance Audits
Question Yes No Comment
Does your agency participate in NPAP, PM2.5 PEP, Pb PEP, Pb Strip Audit, AA-PGVP, and other performance audits performed by an external party and/or using external standards?
If the agency does not participate, please explain why not.
Are NPAP audits performed by QA staff, site operators,
calibration staff, and/or another group?
National Performance Audit Program (NPAP) and Additional Audits
Does the agency participate in the National Performance Audit Program (NPAP) as required under 40 CFR 58, Appendix A? If so,
identify the individual with primary responsibility for the required participation in the National Performance Audit Program.
Name: Program Function:
Please complete the table below:
Parameter Audited | Date of Last NPAP Audit
CO
O3
SO2
NO2
PM10
PM2.5
Pb
VOCs
Carbonyls
Trace CO
Trace SO2
Trace NO
Trace NOy
2) Network Management/Field Operations
State/Local/Tribal Agency Audited:
Address:
City, State, and Zip Code:
Auditor / Agency:
Key Individuals
Ambient Air Monitoring Network Manager:
Quality Assurance Manager:
Field Operations Supervisor/Lead:
Field Operations Staff involved in the TSA:
a) Network Design
Complete the table below for each of the pollutants monitored as part of your air monitoring network. (Record applicable
count by category.) Also indicate seasonal monitoring with an S for a Parameter/Category as appropriate. Provide the
most recent annual monitoring network plan, including date of approval and AQS quick look or if not available, network
description and other similar summary of site data, including SLAMS, Other and Toxics.
Category* SO2 NO2 CO O3 PM10 PM2.5 Pb Other
(type)
Other
(type)
NCore
SLAMS
SPM
PAMS
Total
*NCore - National Core monitoring stations; SLAMS - state and local air monitoring stations; SPM - special purpose monitors;
PAMS - photochemical assessment monitoring stations
Question Yes No Comment
What is the date of the most current Monitoring Network Plan?
Is it available for public inspection?
Does it include the information required for each site?
AQS Site ID #?
Street address and geographic coordinates?
Sampling and Analysis Method(s)?
Operating Schedule?
Monitoring Objective and Scale of Representativeness?
Site suitable/not suitable for comparison to annual PM2.5 NAAQS?
MSA, CBSA or CSA indicated as required?
Indicate by AQS Site ID # any non-conformance with the requirements of 40 CFR 58, Appendices D and E, along with any waivers
granted by the Regional Office (provide waiver documentation).
Monitor Site ID Reason for Non-Conformance
SO2
O3
CO
NO2
PM10
PM2.5
Pb
Question Yes No Comment
Are hard copy site information files retained by the agency for all air monitoring
stations within the network?
Does each station have the required information including:
AQS Site ID Number?
Photographs/slides to the four cardinal compass points?
Startup and shutdown dates?
Documentation of instrumentation?
Who has custody of the current network documents? Name:
Title:
Does the current level of monitoring effort, station placement, instrumentation,
etc., meet requirements imposed by current grant conditions?
How often is the network siting reviewed? Frequency:
Date of last review:
Are there any issues?
Do any sites vary from the required frequency in 40 CFR 58.12?
Does the number of collocated monitoring stations meet the requirements of 40
CFR 58 Appendix A?
b) Changes to the Network since the last audit
What is the date of the most recent network assessment? (Provide copy) Are all SLAMS parameters included? Any others?
Please provide information on any site changes since the last audit.
Pollutant Site ID Site Address Site Added/Deleted/Relocated Reason (Assessment, lost lease, etc. Provide documentation of reason for each site change.)
c) Proposed changes to the Network
Are future network changes proposed?
Please provide information on proposed site changes, including documentation of the need for the change and any required
approvals.
Pollutant Site ID Site Address Site to be Added/Deleted/Relocated Reason (Assessment, lost lease, etc. Provide documentation of reason for each site change.)
d) Field Support
Question Yes No Comment
On average, how often are most of your stations visited by a field operator?
Is this visit frequency consistent for all reporting organizations within your
agency?
On average, how many stations does a single operator have responsibility for?
How many of the stations of your SLAMS/NCORE network are equipped with
sampling manifolds?
Do the sample inlets and manifolds meet the requirements for through-the-probe
audits?
I. Briefly describe the most common manifold type.
II. Are manifolds cleaned periodically?
How often?
III. If the manifold is cleaned, what is used to perform cleaning?
IV. Are manifold(s) equipped with a blower?
V. Is there sufficient air flow through the manifold at all times?
Approximate air flow:
VI. How is the air flow through the manifold monitored?
VII. Is there a conditioning period for the manifold after cleaning?
Length of time:
VIII. What is the residence time? (A minimal worked example follows this question block.)
Sampling lines: What material is used for instrument sampling lines?
Are lines changed or cleaned once per year?
Do you utilize uninterruptable power supplies or backup power sources at
your sites?
What instruments or devices are protected?
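For reference, manifold residence time is commonly estimated as the manifold's internal volume divided by the total volumetric flow through it. A minimal sketch of that arithmetic, using hypothetical dimensions and flow (not values from this handbook):

import math

def residence_time_s(diameter_cm: float, length_cm: float, flow_lpm: float) -> float:
    """Residence time = internal volume / total volumetric flow."""
    radius_cm = diameter_cm / 2.0
    volume_l = math.pi * radius_cm ** 2 * length_cm / 1000.0  # cm^3 to liters
    return (volume_l / flow_lpm) * 60.0  # minutes to seconds

# Hypothetical 5 cm diameter, 200 cm long manifold with 150 L/min total flow:
print(f"Residence time: {residence_time_s(5.0, 200.0, 150.0):.1f} s")  # ~1.6 s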
i) SOPs
Question Yes No Comment
Is the documentation of monitoring SOPs complete?
Are any new monitoring SOPs needed?
Are such procedures available to all field operations personnel?
Are SOPs that detail operations during episode monitoring
prepared and available to field personnel?
Are SOPs based on the framework contained in Guidance for
Preparing Standard Operating Procedures EPA QA/G-6?
Please complete the following table:
Pollutant Monitored Date of Last SOP Review Date of Last SOP Revision
SO2
NO2
CO
O3
PM10
PM2.5 FRM mass
Pb
PM2.5 speciation
PM10-2.5 FRM mass
PM10-2.5 speciation
Continuous PM2.5 mass
Trace levels (CO)
Trace levels (SO2)
Trace levels (NO)
Trace levels (NOy)
Total reactive nitrogen
Surface Meteorology (wind speed and direction, temperature, RH, precipitation, and solar radiation)
Other parameters
ii) Instrument Acceptance
Has your agency obtained necessary waiver provisions to operate equipment which does not meet the effective reference and
equivalency requirements? List all waivers.
Please list instruments in your inventory
Pollutant Number Make and Models Reference or Equivalent Number
SO2
NO2
CO
O3
PM10
PM2.5
Pb
Multi gas calibrator
PM2.5 speciation
PM10-2.5 speciation
PM10-2.5 FRM mass
Continuous PM2.5 mass
Trace levels (CO)
Trace levels (SO2)
Trace levels (NO)
Trace levels (NOy)
Surface Meteorology
Others
Please comment briefly and prioritize your currently identified instrument needs.
Question Yes No Comment
Are criteria established for field QC equipment?
Are criteria established for field QC gas standards?
Question Yes No Comment
Are field calibration procedures documented?
In SOPs?
Location (site, lab, etc.):
Are calibrations performed in keeping with the guidance in Vol.
II of the QA Handbook for Air Pollution Measurement Systems?
If no, why not?
Are calibration procedures consistent with the operational
requirements of Appendices to 40 CFR 50 or to analyzer
operation/instruction manuals?
If no, why not?
Have changes been made to calibration methods based on
manufacturer's suggestions for a particular instrument?
Do standard materials used for calibrations meet the requirements
of appendices to 40 CFR 50 (EPA reference methods) and
Appendix A to 40 CFR 58 (traceability of materials to NIST-
SRMs or CRMs)?
Comment on deviations
Are all flow-measurement devices checked and certified?
Additional comments:
Please list the authoritative standards used for each type of flow measurement, and indicate the certification frequency of
standards to maintain field material/device credibility.
Flow Device Primary Standard Frequency of Certification
Hi-Volume orifice
Streamline
TriCal
BIOS
Delta Cal
Gilibrators
iii) Calibration
Please indicate the frequency of multi-point calibrations.
Pollutant Frequency Name of Calibration Method
Where do field operations personnel obtain gaseous standards?
Standards are certified by:
The agency laboratory?
EPA/NERL standards laboratory?
A laboratory separate from this agency's but part of the same
reporting organization?
The vendor?
Other (describe).
How are the gas standards verified after receipt?
How are flow measurement devices certified?
Please provide copies of certifications of all standards currently
in use from your master and/or satellite standard certification
logbooks (i.e., chemical standards, ozone standards, flow
standards, and zero air standards).
What equipment is used to perform calibrations (e.g., dilution
devices) and how is the performance of this equipment verified?
Does the documentation include expiration date of
certification?
Reference to primary standard used?
What traceability is used?
Please attach an example of recent documentation of traceability
Is calibration equipment maintained at each station?
How is the functional integrity of this equipment documented?
Who has responsibility for maintaining field calibration standards?
Please list the authoritative standards for each type of dilution, permeation, and ozone calibrator, and indicate the
certification frequency.
Calibrator Primary Standard Frequency of Certification
Permeation calibrator flow controller
Permeation calibrator temperature
Dilution calibrator air and gas flow
controllers
Field/working standard photometer
Ozone generator
Please identify station standards for gaseous pollutants at representative air monitoring stations (attach additional sheets
as appropriate):
Parameter Station(s) Identification of Standard(s) Recertification Date(s)
CO
NO2
SO2
O3
iv) Repair
Who is responsible for performing preventive maintenance?
Is special training provided to them for performing preventive maintenance? Briefly comment on background or courses.
Is this training routinely reinforced? Yes No
If no, why not?
What is your preventive maintenance schedule for each type of field instrumentation?
If preventive maintenance is MINOR, it is performed at (check one or more): field station, headquarters facilities,
equipment is sent to manufacturer.
If preventive maintenance is MAJOR, it is performed at (check one or more): field station, headquarters facilities,
equipment is sent to manufacturer.
Does the agency have service contracts or agreements in place with instrument manufacturers? Indicate below or attach
additional pages to show which instrumentation is covered.
Comment briefly on the adequacy and availability of the supply of spare parts, tools and manuals available to the field operator
to perform any necessary maintenance activities. Do you feel that this is adequate to prevent any significant data loss?
Is the agency currently experiencing any recurring problem with equipment or manufacturer(s)? If so, please identify the
equipment or manufacturer, and comment on steps taken to remedy the problem.
Have you lost any data due to repairs in the last 2 years?
More than 24 hours?
More than 48 hours?
More than a week?
Explain any situations where instrument downtime was due to lack of preventive maintenance or unavailability of parts.
v) Record Keeping
Question Yes No Comment
What type of station logbooks are maintained at each monitoring
station? (maintenance logs, calibration logs, personal logs, etc.)
What information is included in the station logbooks?
Who reviews and verifies the logbooks for adequacy of station
performance?
How is control of the logbook maintained?
Where is the completed logbook archived?
What other records are used?
Zero span record?
Gas usage log?
Maintenance log?
Log of precision checks?
Control charts?
A record of audits?
Please describe the use and storage of these documents.
Are calibration records or at least calibration constants available to field
operators?
Please attach an example field calibration record sheet to this questionnaire.
vi) Site Information and Monitor Information
PQAO:
AQS Site Name:
AQS Site Number:
Agency Site Name/No.:
(if different than AQS Site
Name/Number)
Site Address:
City & County:
Site Coordinates:
(specify lat/long or UTM)
Site Elevation (m):
Criteria Pollutants Monitored:
Other Parameters:
Nearest Meteorological Site:
(indicate "on site" if a met tower is present at this site)
Photographs to and from each cardinal direction attached?
(Yes or No)
Name(s) of Report Preparer(s):
Name(s) of Auditors:
Date:
Phone Number:
Site Map
Draw a map of the site and surrounding terrain and features, up to 100 meters.
Map notes
Monitor Information
Pollutants
Manufacturer
Model
Serial number
Scale of representation (Micro, Middle, Neighborhood, Urban)
Objective (Population, Max concentration, Background, Transport)
Height of probe above ground (m)
Distance from obstruction (m)
Type of obstruction (Wall, Tree, etc)
Distance from roadway (m)
Unrestricted airflow (Yes, No)
Designation (NCore, SLAMS, etc)
Siting Criteria Met (Yes, No)
Pollutants
Manufacturer
Model
Serial number
Scale of representation (Micro, Middle, Neighborhood, Urban)
Averaging time (1-, 8-, 24-hour)
Objective (Population, Max concentration, Background, Transport)
Height of probe above ground (m)
Distance from obstruction (m)
Type of obstruction (Wall, Tree, etc)
Distance from roadway (m)
Unrestricted airflow (Yes, No)
Designation (NCore, SLAMS, etc)
Siting Criteria Met (Yes, No)
Insert additional copies of table as needed:
Area Information
Street Name Traffic Count (Vehicles/day)
Direction Predominant Land Use (Industry, Residential, Commercial or Agriculture)
North
East
South
West
Direction Obstructions Height (m) Distance (m)
North
East
South
West
Note: This table is for large obstructions that affect the entire site, such as large clusters of trees or entire buildings.
Individual obstructions, such as walls, single trees, other monitors, etc, should be entered in the Monitor Information table.
Direction Topographic Features (hills, valleys, rivers,
etc.)
General Terrain (flat, rolling, rough)
North
East
South
West
Comments:
3) Laboratory Operations
State/Local/Tribal Agency Audited:
City, State, and Zip Code:
Date of Technical System Audit:
Auditor / Agency:
Key Individuals
Laboratory Manager:
Laboratory Supervisor:
Quality Assurance Manager:
Laboratory Staff involved in the TSA:
a) Routine Operations
What analytical methods are employed in support of your air monitoring network?
Analysis Name or Description of Method
PM10
PM2.5
Pb
Others (list by pollutant)
1. Please describe areas where there have been difficulties meeting the regulatory requirements for any of the above analytical
methods.
In the table below, please identify the current versions of written methods, supplements, and guidelines that are used in your agency.
Analysis Documentation of Method
PM10
PM2.5
Pb
Others (list by pollutant)
Question Yes No Comment
Were procedures for the methods listed above included in the
agency's QAPP or SOPs, and were they reviewed by EPA? Also,
are SOPs easily/readily accessible for use and reference?
Does your lab have sufficient instrumentation to conduct analyses?
Please describe needs for laboratory instrumentation
b) Laboratory Quality Control
Please identify laboratory standards used in support of the air monitoring program, including standards which may be kept
in an analytical laboratory and standards which may be kept in a field support area or quality assurance laboratory that is
dedicated to the air monitoring program (attach additional sheets if appropriate):
Parameter Location of Standards Laboratory Standard Recertification Date Primary Standard*
CO
NO2
SO2
O3
Weights
Temperature
Moisture
Barometric Pressure
Flow
Other Flow Standard
Lead
Other
*Standards to which the laboratory standards can be traced.
Question Yes No Comment
Are all chemicals and solutions clearly marked with an
indication of shelf life?
Are chemicals removed and properly disposed of when
shelf life expires?
Are only ACS grade chemicals used by the laboratory?
Comment on the traceability of chemicals used in the preparation of calibration standards.
Question Yes No Comment
Does the laboratory purchase standard solutions such as
those for use with lead or other metals analysis?
Are all calibration procedures documented?
If the answer is yes, please describe the following:
Title of the document:
Revision number:
Where the document is kept:
Are at least one duplicate, one blank, and one standard or
spike included with a given analytical batch?
Briefly describe the laboratory's use of data derived from blank analyses.
Question Yes No Comment
Are criteria established to determine whether blank
data are acceptable?
How frequently and at what concentration ranges does the lab perform duplicate analysis? What constitutes an acceptable
agreement? Please comment in the space below.
Please describe how the lab uses data obtained from spiked samples, including the acceptance criteria (e.g., acceptable percent
recovery).
Question Yes No Comment
Does the laboratory routinely include samples of
reference material within an analytical batch?
If yes, indicate frequency, level, and material used.
Are mid-range standards included in analytical batches?
Please describe the frequency, level and compound used in the space provided below.
Are criteria for real time quality control established that
are based on the results obtained for the mid-range
standards discussed above?
If yes, briefly discuss them below or indicate the document in which they can be found.
Are appropriate acceptance criteria for each type of
analysis documented?
c) Laboratory Preventive Maintenance
Question Yes No Comment
For laboratory equipment, who has the responsibility for performing preventive maintenance?
Is most maintenance performed in the lab?
Is a maintenance log maintained for each major
laboratory instrument?
Are service contracts in place for major analytical
instruments?
d) Laboratory Record Keeping
Question Yes No Comment
Are all samples that are received by the laboratory logged
in?
Discuss sample routing and special needs for analysis (or attach a copy of the latest SOP which covers this). Attach a flow chart if
possible.
Are log books kept for all analytical laboratory
instruments?
Are there log books or other records that indicate the
checks made on materials and instruments such as
weights, humidity indicators, balances, and thermometers?
Identify type of record, acceptable/non-acceptable.
Are log books maintained to track the preparation of filters
for the field?
Are they current?
Do they indicate proper use of conditioning?
Weighings?
Stamping and numbering?
Are log books kept which track filters returning from the
field for analysis?
How are data records from the laboratory archived?
Where?
Who has the responsibility?
Title:
How long are records kept? Years
Does a chain-of-custody procedure exist for laboratory
samples?
If yes, indicate date, title and revision number where it can be found.
e) Laboratory Data Acquisition and Handling
Question Yes No Comment
Identify those laboratory instruments which make use of computer interfaces directly to record data. Which ones use strip charts?
Integrators?
Are QC data readily available to the analyst during a
given analytical run?
What is the laboratory's capability with regard to data recovery? In case of problems, can they recapture data, or are they dependent
on computer operations? Discuss briefly.
Has a user's manual been prepared for the automated data
acquisition instrumentation?
Please provide below a data flow diagram which establishes, by a short summary flow chart, the transcriptions, validations, and
reporting format changes the data go through before being released by the laboratory.
f) Specific Pollutants: PM10, PM2.5, and Lead
Question Yes No Comment
PM10 and PM2.5
Does the agency use filters supplied by EPA?
Do filters meet the specifications in 40 CFR 50?
Are filters visually inspected via strong light from a view box
for pinholes and other imperfections?
Where does the laboratory keep records of the serial numbers
of filters?
Are unexposed filters equilibrated in a controlled conditioning environment which meets or exceeds the requirements of 40 CFR
50?
Are the temperature and relative humidity of the conditioning
environment monitored?
Are the temperature and humidity monitors calibrated?
Are balances checked with Class S or Class M weights each
day when they are used?
Is the balance check information placed in a QC log book?
To what sensitivity are filter weights recorded?
Are filter serial numbers and tare weights recorded in a
bound notebook?
Are filters packaged for protection while transporting to and
from the monitoring stations?
How often are filter samples collected? (Indicate the average
elapsed time in hours between end of sampling and laboratory receipt.)
In what medium are field measurements recorded (e.g., in a log book, on a filter folder, or on standard forms)?
Are exposed filters reconditioned for at least 24 hrs in the same conditioning environment as for unexposed filters?
Briefly describe how exposed filters are prepared for conditioning.
Briefly describe how exposed filters are stored after being weighed.
Are blank filters reweighed? How often?
Are chemical analyses performed on filters?
LEAD
Is analysis for lead being conducted using atomic absorption
spectrometry with an air-acetylene flame?
If not, has the agency received an equivalency
designation of their procedure?
Is either the hot acid or ultrasonic extraction procedure being
followed precisely?
Which?
Is Class A borosilicate glassware used throughout the
analysis?
Is all glassware cleaned with detergent, soaked and rinsed
three times with distilled or de-ionized water?
If extracted samples are stored, are linear polyethylene
bottles used?
Are all batches of glass fiber filters tested for background
lead content?
At a rate of 20 to 30 random filters per batch of 500 or
greater?
Indicate rate.
Are ACS reagent grade HNO3 and HCl used in the analysis?
Is a calibration curve available having concentrations that
cover the linear absorption range of the atomic absorption
instrumentation?
Is the stability of the calibration curve checked by alternately
re-measuring every 10th sample at concentrations of <=1 µg
Pb/ml and <=10 µg Pb/ml?
4) Data and Data Management
State/Local/Tribal Agency Audited:
City, State, and Zip Code:
Date of Technical System Audit:
Auditor / Agency:
Key Individuals
Data Manager:
Data Supervisor:
Quality Assurance Manager:
a) Data Handling
Question Yes No Comment
Is there a procedure, description, or a chart which shows a complete
data sequence from point of acquisition to point of submission of data
to EPA?
Please provide below a data flow diagram indicating the data flow within the reporting organization.
Are procedures for data handling (e.g., data reduction, review, etc.)
documented?
In what media (e.g., diskette, data cartridge, or telemetry) and formats do data arrive at the data processing location? Please list
below.
Category of Data (by Pollutant) Data Media and Formats
How often are data received at the processing location from the field sites and laboratory?
Is there documentation accompanying the data regarding any media
changes, transcriptions, or flags which have been placed into the data
before data are released to agency internal data processing?
Describe the type of documentation.
How are data actually entered into the computer system (e.g., computerized transcription (copy from disk or data transfer device),
manual entry, digitization of strip charts, or other)?
b) Software Documentation
Question Yes No Comment
Does your agency use any AQS Manual?
Does your agency use any Air Now Manual?
If yes, list the title of the manual used, including the version number and date published.
Does the agency have information on the reporting of precision and
accuracy data available (i.e. AMP 255)?
What are the origins of the software used to prepare air monitoring data for release into the AQS and Air Now database? Please list
the documentation for the software currently in use for data processing, including the names of the software packages, vendor or
author, revision numbers, and the revision dates of the software.
What is the recovery capability in the event of a significant computer problem (i.e., how much time and data would be lost)?
Has your agency tested the data processing software to ensure its
performance of the intended function is consistent with the QA
Handbook, Volume II, Section 14.0?
Does your agency document software tests?
If yes, provide the documentation.
c) Data Validation and Correction
Question Yes No Comment
Has your agency established and documented the
validation criteria?
If yes, indicate the document where such criteria can be
found (title, revision date).
Does documentation exist on the identification and
applicability of flags (i.e., identification of suspect
values) within the data as recorded with the data in the
computer files?
Does your agency document the data validation criteria
including limits for values such as flow rates, calibration
results, or range tests for ambient measurements?
If yes, please describe what action the data validator will take if he/she finds data with limits exceeded (e.g., flag, modify, or
delete).
If yes, give examples to illustrate actions taken when limits were exceeded.
Please describe how changes made to data that were submitted to AQS and Air Now are documented.
Who has signature authority for approving corrections?
Name: Program Function:
What criteria are used to determine whether a data point is deleted? Discuss briefly.
What criteria are used to determine if data need to be reprocessed? Discuss.
Are corrected data resubmitted to the issuing group for
cross-checking prior to release?
d) Data Processing
Question Yes No Comment
Does the agency generate data summary reports?
Please list at least three reports routinely generated, including the information requested below.
Report Title Distribution Period Covered
Question Yes No Comment
How often are data submitted to AQS and Air Now?
Briefly comment on difficulties the agency may have encountered in coding and submitting data following the guidance of the AQS
guidelines.
Does the agency routinely request a hard copy printout on
submitted data from AQS?
Are records kept for at least 3 years by the agency in an
orderly, accessible form?
If yes, does this include raw data, calculations, QC data, and reports?
If no, please comment.
Has your agency submitted data along with the
appropriate calibration equations used to the processing
center?
Are concentrations of pollutants (other than PM2.5)
corrected to EPA standard temperature and pressure
conditions (i.e., 298 K, 760 mm Hg) before input to
AQS, and concentrations of PM2.5 reported to AQS
under actual (volumetric) conditions? (A worked example follows this question block.)
Are audits of data reduction procedures performed on a
routine basis?
If yes, at what frequency?
Are data precision and accuracy checked each time they
are calculated, recorded, or transcribed to ensure that
incorrect values are not submitted to EPA?
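As a worked illustration of the correction asked about above: for a mass-per-volume concentration, the sampled mass is fixed and only the air volume is rescaled by the ideal-gas law, so C_std = C_ambient x (760 / P_ambient) x (T_ambient / 298). A minimal sketch of that arithmetic (illustrative values only, not a prescribed implementation):

T_STD_K = 298.0      # EPA standard temperature (25 C)
P_STD_MMHG = 760.0   # EPA standard pressure

def to_std_conditions(conc_ambient: float, temp_k: float, press_mmhg: float) -> float:
    """Rescale a mass/volume concentration from ambient to EPA standard
    conditions; only the air volume changes, per the ideal-gas law."""
    return conc_ambient * (P_STD_MMHG / press_mmhg) * (temp_k / T_STD_K)

# Illustrative: 10.0 ug/m3 measured at 283 K and 640 mm Hg (high-altitude site):
print(f"{to_std_conditions(10.0, 283.0, 640.0):.2f} ug/m3 at standard conditions")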
e) Internal Reporting
What internal reports are prepared and submitted as a result of the audits required under 40 CFR 58, Appendix A?
Report Title Frequency
What internal reports are prepared and submitted as a result of precision checks also required under 40 CFR 58, Appendix A?
Report Title Frequency
Question Yes No Comment
Do either the audit or precision check reports indicated
above include a discussion of corrective actions initiated based
on audit or precision check results?
Who has the responsibility for the calculation and preparation of data summaries? To whom are such summaries delivered?
Name Title Type of Report Recipient
f) External Reporting
For the current calendar year or portion thereof which ended at least 90 calendar days prior to the receipt of this
questionnaire, please provide the following percentages for required data submitted on time.
Percent Submitted on Time* Period Covered:
Monitoring Qtr.
SO2 CO O3 NO2 PM10 PM2.5 Pb
1 (Jan 1 - March 31)
2 (Apr 1 - June 30)
3 (July 1 - Sept. 30)
4 (Oct. 1 - Dec. 31)
*"On time" = within 90 calendar days after the end of the quarter in which the data were collected.
For the same period, what fraction of the stations (by pollutant) reported less than 75% of the data (adjusted for seasonal
monitoring and site start-ups and terminations)?
Percent of Stations <75% Data Recovery Period Covered:
Monitoring Qtr.
SO2 CO O3 NO2 PM10 PM2.5 Pb
1 (Jan 1 - March 31)
2 (Apr 1 - June 30)
3 (July 1 - Sept. 30)
4 (Oct. 1 - Dec. 31)
Identify the individual within the agency with the responsibility for reviewing and releasing the data.
Name: Program Function:
Question Yes No Comment
Does your agency report the Air Quality Index?
Has your agency submitted its annual data summary report as required in
40 CFR 58.15(b)?
If yes, did your agency's annual report include the following:
Annual precision and accuracy information (i.e. AMP 255) described in
40 CFR 58.15 (c)?
Location, date, pollution source and duration of all episodes reaching
the significant harm levels?
Is Data Certification signed by a senior officer of your agency?
Appendix I
Examples of Reports to Management
The following example of an annual quality assurance report consists of a number of sections that
describe the quality objectives for selected sets of measurement data and how those objectives
have been met. Sections include:
Executive Summary,
Introduction, and
Quality information for each ambient air pollutant monitoring program.
The report is titled "Acme Reporting Organization, Annual Quality Assurance Report for 2000".
ACME REPORTING ORGANIZATION
ANNUAL QUALITY ASSURANCE REPORT FOR 2000
Prepared by
Quality Assurance Department
Acme Reporting Organization
110 Generic Office Building
Townone XX, 00001
April 2001
ACME REPORTING ORGANIZATION
ANNUAL QUALITY ASSURANCE REPORT FOR 2000
TABLE OF CONTENTS
EXECUTIVE SUMMARY
INTRODUCTION
Data quality
Quality assurance procedures
GASEOUS CRITERIA POLLUTANTS
Program update
Quality objectives for measurement data
Data quality assessment
PARTICULATE CRITERIA POLLUTANTS
Program update
Quality objectives for measurement data
Data quality assessment
TOTAL AND SPECIATED VOLATILE ORGANIC COMPOUNDS
Program update
Quality objectives for measurement data
Data quality assessment
AIR TOXIC COMPOUNDS
Program update
Quality objectives for measurement data
Data quality assessment
ACME REPORTING ORGANIZATION
ANNUAL QUALITY ASSURANCE REPORT FOR 2000
EXECUTIVE SUMMARY
This summary describes the Acme Reporting Organization's (ARO's) success in meeting its quality
objectives for ambient air pollution monitoring data. ARO's attainment of quantitative objectives, such as
promptness, completeness, precision, and bias, is shown in Table 1, below. ARO met these objectives
for all pollutants, with the exception of nitrogen dioxide. The failure to meet completeness and timeliness
goals for nitrogen dioxide was due to the breakdown of several older analyzers. Replacement parts were
installed and the analyzers are now providing data that meet ARO's quality objectives.
Table 1. Attainment of Quantitative Quality Objectives for Ambient Air Monitoring Data
Measurement Program met objectives for: Promptness Completeness Precision Bias
Air Toxics Yes Yes Yes Yes
Carbon Monoxide Yes Yes Yes Yes
Lead Yes Yes Yes Yes
Nitrogen Dioxide No No Yes Yes
Ozone Yes Yes Yes Yes
Sulfur Dioxide Yes Yes Yes Yes
PM10 Yes Yes Yes Yes
PM2.5 Yes Yes Yes Yes
Volatile Organic Compounds (VOCs) Yes Yes Yes Yes
Other quality objectives (for example those concerning siting, recordkeeping, etc.) were assessed via
laboratory and field system audits. The results of these audits indicate compliance with ARO's standard
operating procedures except for the following:
The Towntwo site was shadowed by a 20-story office building which was recently completed.
This site was closed in July 2000.
The Townfour site had problems with vandalism. A new, more secure fence was installed in
April, and the sheriff's department increased patrols in the area to prevent recurrences.
Newly acquired laboratory analytical instruments did not have maintenance logs. New logs were
obtained and personnel were instructed on their use. A spot check, approximately one month
later, indicated the new logs were in use.
A review of equipment inventories identified three older sulfur dioxide ambient air monitors that, based
on our past experience, are likely to experience problems. Cost information and a schedule for
replacement have been prepared and submitted to management for funding. Based on this schedule, the
new monitors will be installed before the end of 2001.
INTRODUCTION
The Acme Reporting Organization (ARO) conducts ambient air monitoring programs for the State Bureau
of Environmental Quality and local air quality management districts. These programs involve:
monitoring of criteria pollutants to determine the National Ambient Air Quality Standards
(NAAQS) attainment status of state and local air quality. This monitoring is conducted as part of
the State and Local Air Monitoring Stations (SLAMS) and National Air Monitoring Stations
(NAMS) networks.
monitoring compounds (volatile organic compounds and nitrogen oxides), referred to as ozone
precursors, that can produce the criteria pollutant ozone. This monitoring is conducted as part of
the Photochemical Assessment Monitoring Stations (PAMS) network.
monitoring toxic air pollutants.
The purpose of this report is to summarize the results of quality assurance activities performed by ARO to
ensure that the data meet their quality objectives. This report is organized by ambient air pollutant
category (e.g., gaseous criteria pollutants, air toxics). The following are discussed for each pollutant
category:
program overview and update
quality objectives for measurement data
data quality assessment
DATA QUALITY
Data quality is related to the need of users for data of sufficient quality for decision making. Each user
specifies their needed data quality in the form of their data quality objectives (DQOs). Quality objectives
for measurement data are designed to ensure that the end user's DQOs are met. Measurement quality
objectives are concerned both with quantitative objectives (such as representativeness, completeness,
promptness, accuracy, precision and detection level) and qualitative objectives (such as site placement,
operator training, and sample handling techniques).
QUALITY ASSURANCE PROCEDURES
Quality assurance is a general term for the procedures used to ensure that a particular measurement meets
the quality requirements for its intended use. In addition to performing tests to determine bias and
precision, additional quality indicators (such as sensitivity, representativeness, completeness, timeliness,
documentation quality, and sample custody control) are also evaluated. Quality assurance procedures fall
under two categories:
quality control - procedures built into the daily sampling and analysis methodologies to ensure
data quality, and
quality assessment - which refers to periodic outside evaluations of data quality.
Some ambient air monitoring is performed by automated equipment located at field sites, while other
measurements are made by taking samples from the field to the laboratory for analysis. For this reason,
we will divide quality assurance procedures into two parts: field and laboratory quality assurance.
Field Quality Assurance
Quality control of automated analyzers and samplers consists of calibration and precision checks. The
overall precision of sampling methods is measured using collocated samplers. Quality assurance is
evaluated by periodic performance and system audits.
Calibration - Automated analyzers (except ozone) are calibrated by comparing the instrument's response
when sampling a cylinder gas standard mixture to the cylinder's known concentration level. The analyzer
is then adjusted to produce the correct response. Ozone analyzers are calibrated by on-site generation of
ozone whose concentration is determined by a separate analyzer which has its calibration traceable to the
U.S. Environmental Protection Agency. The site's analyzer is then adjusted to produce the same measured
concentration as the traceable analyzer. Manual samplers are calibrated by comparing their volumetric
flow rate at one or more flow rates to the flow measured by a flow rate transfer standard. Calibrations are
performed when an instrument is first installed and at semi-annual intervals thereafter. Calibrations are
also performed after instrument repairs or when quality control charts indicate a drift in response to
quality control check standards.
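As one concrete illustration of the comparison described above (a sketch under assumed values, not a procedure from this handbook), a zero/span check can be reduced to a slope and intercept that map instrument response back to concentration:

def two_point_calibration(zero_resp: float, span_resp: float, span_conc: float):
    """Fit response = slope * concentration + intercept from a zero gas and
    one span gas of known concentration."""
    slope = (span_resp - zero_resp) / span_conc
    return slope, zero_resp

def to_concentration(resp: float, slope: float, intercept: float) -> float:
    """Invert the calibration to recover concentration from a raw response."""
    return (resp - intercept) / slope

# Hypothetical CO analyzer: 0.02 V on zero air, 4.05 V on a 40.0 ppm span gas.
slope, intercept = two_point_calibration(0.02, 4.05, 40.0)
print(f"{to_concentration(2.00, slope, intercept):.2f} ppm")  # about 19.7 ppm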
Precision - Precision is a measure of the variability of an instrument. The precision of automated
analyzers is evaluated by comparing the sample's known concentration against the instrument's response.
The precision of manual samplers is determined by collocated sampling: the simultaneous operation of
two identical samplers placed side by side. The difference in the results of the two samplers is used to
estimate the precision of the entire measurement process (i.e., both field and laboratory precision).
Performance Audits - The bias of automated methods is assessed through field performance audits.
Performance audits are conducted by sampling a blind sample (i.e., a sample whose concentration is
known, but not to the operator). Bias is evaluated by comparing the measured response to the known
value. Typically, performance audits are performed annually using blind samples of several different
concentrations.
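The precision-check and audit comparisons described above are commonly summarized as a percent difference between the monitor's response and the known value. A minimal sketch of that calculation (the audit values below are hypothetical):

def percent_difference(measured: float, known: float) -> float:
    """Percent difference of a monitor response relative to the known value."""
    return 100.0 * (measured - known) / known

# Hypothetical three-level ozone audit (ppm): (measured, known) pairs.
for measured, known in [(0.052, 0.050), (0.148, 0.150), (0.305, 0.300)]:
    print(f"known {known:.3f} ppm -> bias {percent_difference(measured, known):+.1f}%")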
System Audits - System audits indicate how well a sampling site conforms to the standard operating
procedures as well as how well the site is located with respect to its mission (e.g., urban or rural sampling,
special purpose sampling site, etc.). System audits involve sending a trained observer (QA Auditor) to the
site to review the site compliance with standard operating procedures. Some areas reviewed include: site
location (possible obstruction, presence of nearby pollutant sources), site security, site characteristics
(urban versus suburban or rural), site maintenance, physical facilities (maintenance, type and operational
quality of equipment, buildings, etc.), recordkeeping, sample handling, storage and transport.
Laboratory Quality Assurance
Laboratory quality control includes calibration of analytical instrumentation, analysis of blank samples to
check for contamination, and analysis of duplicate samples to evaluate precision. Quality assurance is
accomplished through laboratory performance and system audits.
Calibration - Laboratory analytical instruments are calibrated by comparing the instrument's response
when sampling standards of known concentration level. The difference between the measured and known
concentrations is then used to adjust the instrument to produce the correct response.
Blank Analysis - A blank sample is one that has intentionally not been exposed to the pollutant of interest.
Analysis of blank samples reveals possible contamination in the laboratory or during field handling or
transportation.
Duplicate Analysis - Duplicate analyses of the same sample are performed to monitor the precision of the
analytical method.
Performance Audits - Regular performance audits are conducted by having the laboratory analyze
samples whose physical or chemical properties have been certified by an external laboratory or standards
organization. The difference between the laboratory's reported value and the certified values is used to
evaluate the analytical method's accuracy.
System Audits - System audits indicate how well the laboratory conforms to its standard operating
procedures. System audits involve sending a trained observer (QA Auditor) to the laboratory to review
compliance with standard operating conditions. Areas examined include: record keeping, sample
custody, equipment maintenance, personnel training and qualifications, and a general review of facilities
and equipment.
GASEOUS CRITERIA POLLUTANTS
The Acme Reporting Organization monitors the ambient concentrations of the gaseous criteria pollutants
carbon monoxide (CO), nitrogen dioxide (NO2), ozone (O3), and sulfur dioxide (SO2) to determine
attainment of Federal (NAAQS) and State ambient air quality standards. Monitoring of these pollutants is
conducted continuously by a network of automated stations.
PROGRAM UPDATE
At the beginning of 2000, the Acme Reporting Organization operated 38 ambient air monitoring stations
that measured gaseous criteria pollutants. On March 1, 2000, a station was opened at Townone to monitor
CO, NO2, O3, and SO2. The station at Towntwo, which monitored NO2, O3, and SO2, was closed in April 2000.
QUALITY OBJECTIVES FOR MEASUREMENT DATA
The Quality Objectives for the Acme Reporting Organization's ambient air monitoring of gaseous criteria
pollutants are shown in Table 2, below.
Table 2. Quality Objectives for Gaseous Criteria
Pollutants
Data Quality Indicator Objective
Precision 10%
Bias 15%
Completeness 75%
Promptness 100%
DATA QUALITY ASSESSMENT
Summary
Assessment of the data quality for ARO gaseous criteria pollutants showed that all instruments met goals
for accuracy, precision, completeness, and promptness. System audits showed siting problems at three
sites; two of these were corrected promptly, while the third site had to be closed due to the construction of
a large office building nearby.
Promptness and Completeness
At least 75 percent of scheduled monitoring data must be reported for purposes of determining attainment
of NAAQS. All data must be submitted within 90 days after the end of the reporting quarter. Table 3
summarizes promptness and completeness for gaseous criteria pollutant data.
Table 3. Data Quality Assessment for Promptness
and Completeness
Pollutant Promptness Completeness
Carbon monoxide 100% 95%
Nitrogen dioxide 100% 97%
Ozone 100% 94%
Sulfur dioxide 100% 96%
Precision
At least once every two weeks, precision is determined by sampling a gas of known concentration. Table
4 summarizes the precision checks for gaseous criteria pollutants.
Table 4. Data Quality Assessment for Precision
Pollutant Precision checks completed Percentage within limits
Carbon monoxide (CO) 98% 98%
Nitrogen dioxide (NO2) 100% 97%
Ozone (O3) 97% 98%
Sulfur dioxide (SO2) 100% 98%
Bias
The results of annual performance audits conducted by ARO personnel are shown in Figure 1, below. The
center line for each pollutant represents the average bias across all analyzers (i.e., with all analyzers
weighted equally). The lower and upper probability limits represent the boundaries within which 95
percent of the individual bias values are expected to be distributed.
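One common way to compute such limits (assumed here for illustration; the handbook's statistical procedures govern in practice) is the mean bias plus or minus 1.96 standard deviations, which bounds roughly 95 percent of individual values if they are approximately normally distributed:

import statistics

def bias_limits(biases):
    """Mean bias and approximate 95% probability limits (mean +/- 1.96 s)."""
    mean = statistics.mean(biases)
    s = statistics.stdev(biases)
    return mean, mean - 1.96 * s, mean + 1.96 * s

# Illustrative percent-bias results from several analyzers of one pollutant:
center, lower, upper = bias_limits([-2.1, 0.5, 1.8, -0.7, 3.2, -1.4, 0.9])
print(f"center {center:+.2f}%, limits [{lower:+.2f}%, {upper:+.2f}%]")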
Figure 2 shows the results of external performance audits performed with the National Performance Audit
Program (NPAP), administered by the U.S. EPA.
System Audits
Systems audits were performed at approximately 25 percent of the sites during the calendar year
2000. These audits evaluated areas such as siting criteria, analyzer operation and maintenance,
operator training, recordkeeping, and served as a general review of site operations. No significant
problems were observed, except for the following:
The Towntwo site was shadowed by a 20-story office building which was recently
completed. This site was closed in July 2000.
The Townfour site had problems with repeated vandalism. A new, more secure fence
was installed in April, and the sheriff's department increased patrols in the area to prevent
recurrences.
The Townsix site had vegetation which had grown too close to the analyzer inlet probes.
The vegetation was removed within one week after the problem was reported. Personnel
from the County Parks and Recreation Department provided assistance removing the
vegetation.
PARTICULATE CRITERIA POLLUTANTS
The Acme Reporting Organization monitors the ambient concentrations of three particulate criteria
pollutants:
Lead;
PM10 (particles with an aerodynamic diameter less than or equal to a nominal 10 micrometers); and
PM2.5 (particles with an aerodynamic diameter less than or equal to a nominal 2.5 micrometers).
This monitoring is used to determine attainment of Federal (NAAQS) and State ambient air quality
standards. Monitoring of these pollutants is conducted by sampling for 24 hours every six days by a
network of manually operated samplers.
PROGRAM UPDATE
At the beginning of 2000, the Acme Reporting Organization operated 22 ambient air monitoring stations
that measured particulate criteria pollutants. On March 1, 2000, a station was opened at Townone to
monitor PM10, PM2.5, and lead. The station at Towntwo, which monitored PM10, PM2.5, and lead, was closed in April 2000.
QUALITY OBJECTIVES FOR MEASUREMENT DATA
The Quality Objectives for the Acme Reporting Organization's ambient air monitoring of particulate
criteria pollutants are shown in Table 5, below.
Table 5. Quality Objectives for Particulate Criteria
Pollutants
Data Quality Indicator Objective
Precision 7%
Bias 10%
Completeness 75%
Promptness 100%
DATA QUALITY ASSESSMENT
Summary
Assessment of the data quality for ARO particulate criteria pollutants showed that all samplers
met goals for accuracy, precision, completeness, and promptness. System audits showed siting
problems at three sites. Two of these were corrected promptly, while the third site had to be
closed due to the construction of a large office building nearby.
Promptness and Completeness
At least 75 percent of scheduled monitoring data must be reported for purposes of determining attainment
of NAAQS. All data must be submitted within 90 days after the end of the reporting quarter. Table 6
summarizes promptness and completeness data for particulate criteria pollutants.
Table 6. Data Quality Assessment for Promptness and
Completeness
Pollutant Promptness Completeness
Lead 100% 93%
PM10 100% 95%
PM2.5 100% 92%
Precision
Precision is determined by operating collocated samplers (i.e., two identical samplers operated in the
identical manner). Due to the anticipated poor precision for very low levels of pollutants, only collocated
measurements above a minimum level (0.15 µg/m3 for lead, 20 µg/m3 for PM10, and 6 µg/m3 for PM2.5)
are used to evaluate precision. Table 7 summarizes the results of collocated measurements made during
the calendar year 2000.
Table 7. Data Quality Assessment for Precision
Pollutant Collocated precision measurements completed Collocated measurements within limits
Lead 98% 98%
PM10 100% 97%
PM2.5 97% 98%
Flow rate precision
A flow rate precision check is conducted at least every two weeks for PM10 and PM2.5 samplers. The flow
should be within 10% of the specified value. Results are shown in Table 8.
Table 8. Flow Rate Precision Checks for Particulate Criteria Pollutants
Pollutant Precision Checks completed Precision Checks within limits
Lead 98% 98%
PM10 100% 97%
PM2.5 97% 98%
Flow rate bias
Results of the annual flow rate audits conducted by ARO personnel are shown in Figure 3, below. The
center line for each pollutant represents the average bias across all samplers (i.e., with all samplers
weighted equally). The lower and upper probability limits represent the boundaries within which 95
percent of the individual bias values are expected to be distributed.
Figure 4 shows the results of external flow rate audits for PM10 and lead samplers performed with the
National Performance Audit Program (NPAP), which is administered by the U.S. EPA. Currently, NPAP
audits of PM2.5 samplers involve sampler collocation rather than flow rate checks.
Measurement Bias
Measurement bias is evaluated for PM2.5 analyzers by collocated sampling using an audit sampler. For
internal audits, the collocated measurements provide an estimate of bias resulting from sampler
operations. For external NPAP audits, the collocated measurements provide an estimate of bias resulting
from both sampler and laboratory operations. Measurement bias for lead is evaluated by use of standard
lead test samples. This provides an estimate of the bias resulting from laboratory operations. The results
of the annual performance audits of PM2.5 and lead conducted by ARO personnel are shown in Figure 5,
below.
Figure 6 shows the results of external performance audits for PM10 and lead performed with the National
Performance Audit Program (NPAP), which is administered by the U.S. EPA.
System Audits
Systems audits were performed at approximately one fourth of the sites and at the central analytical
laboratory during calendar year 2000. These audits evaluated areas such as siting criteria, equipment
operation and maintenance, operator training, recordkeeping, and served as a general review of site
operations. No significant problems were observed, except for the following:
The Towntwo site was shadowed by a 20-story office building which was recently completed.
This site was closed in July 2000.
The Townfour site had problems with repeated vandalism. A new, more secure fence was
installed in April, and the sheriff's department increased patrols in the area to prevent
recurrences.
No significant problems were found in the laboratory audits, except for failure to keep maintenance logs
on several newly acquired analytical instruments. New logs were obtained and personnel instructed on
their use. A spot check, approximately one month later, indicated the logs were in use.
TOTAL AND SPECIATED VOLATILE ORGANIC COMPOUNDS (PAMS)
The Acme Reporting Organization monitors the ambient concentrations of ozone precursors (volatile
organic compounds [VOCs], carbonyls, and nitrogen oxides that can produce the criteria pollutant ozone).
This monitoring is conducted as part of the Photochemical Assessment Monitoring Stations (PAMS)
network. Nitrogen dioxide (one of the nitrogen oxides measured in PAMS) is also a criteria pollutant and
its measurement is described under the gaseous criteria pollutant section, above. Total nitrogen oxides
(NOx) measurements are obtained continuously by a network of automated stations. Volatile organic
compounds (VOCs), excluding carbonyls, are measured by continuous analyzers (on-line gas
chromatographs) at selected sites. The remaining sites use automated samplers to collect VOC canister
samples, which are then transported to the laboratory for analysis. Carbonyls are collected in adsorbent
sampling tubes, which are transported to the laboratory for analysis.
PROGRAM UPDATE
At the beginning of 2000, the Acme Reporting Organization operated 5 ambient air monitoring stations
that measured ozone precursors. On March 1, 2000, a station was opened at Townone to monitor VOCs,
carbonyls, and NOx.
QUALITY OBJECTIVES FOR MEASUREMENT DATA
The Quality Objectives for the Acme Reporting Organization's ambient air monitoring of ozone
precursors are shown in Table 9, below.
Table 9. Quality Objectives for Ozone Precursors
Data Quality Indicator Objective
Precision (NOx) 10%
Precision (VOC, Carbonyls) 25%
Bias (NOx) 15%
Bias (VOC, Carbonyls) 20%
Completeness 75%
Promptness 100%
DATA QUALITY ASSESSMENT
Summary
Assessment of the data quality for ozone precursors showed that all instruments met goals for accuracy,
precision, completeness, and promptness. System audits showed siting problems at two sites; both of
these were corrected promptly.
Promptness and Completeness
At least 75 percent of scheduled monitoring data must be reported. All data must be submitted within six
months after the end of the reporting quarter. Table 10 summarizes promptness and completeness data
for ozone precursors.
Table 10. Data Quality Assessment for Promptness and Completeness
Ozone precursor Promptness Completeness
Carbonyls 100% 80%
Nitrogen Oxides (NOx) 100% 96%
Total VOCs (total non-methane hydrocarbons) 100% 87%
Speciated VOCs 100% 83%
Precision
At least once every two weeks, precision for nitrogen oxides (NOx) and automated VOC analysis is
determined by sampling a gas of known concentration. Precision for manual VOC sampling and carbonyl
determined by sampling a gas of known concentration. Precision for manual VOC sampling and carbonyl
sampling is obtained by analysis of duplicate samples. Duplicates are taken at a frequency of one
duplicate for every 10 samples. Table 11 summarizes the precision check results for 2000.
Table 11. Data Quality Assessment for Precision
Ozone precursor Precision checks completed Precision checks within limits
Carbonyls 91% 90%
Nitrogen Oxides (NOx) 98% 97%
Total VOCs (total non-methane hydrocarbons) 90% 91%
Speciated VOCs 95% 80%
Bias
The results of the annual performance audits conducted by ARO personnel are shown in
Figure 7, below. For NOx and the automated VOC analyzers, the center line represents the
average bias across all sites (i.e., with all sites weighted equally). For the carbonyl and manual
VOC analyses, the center line represents the average of all audit samples for the central
analytical laboratory. The lower and upper probability limits represent the boundaries within
which 95 percent of the individual bias values are expected to be distributed. Carbonyl and Total VOC
measurements represent the average of all audit species.
Figure 8 shows the results of the external performance audits performed with the National Performance
Audit Program (NPAP) which is administered by the U.S. EPA.
System Audits
Systems audits were performed at two sites during calendar year 2000. These audits evaluated
areas such as siting criteria, analyzer and sampler operation and maintenance, operator training,
recordkeeping, and served as a general review of site operations. In general, both sites were
performing well, except for the following:
The Townsix site had vegetation which had grown too close to the analyzer inlet probes. The
vegetation was removed within one week, with assistance from the County Parks and Recreation
Department.
A systems audit was also performed at the central analytical laboratory. Results were good, with only
minor items noted for improvement.
AIR TOXICS
The Acme Reporting Organization monitors the ambient concentrations of air toxic compounds. Three
different methods are used, depending on the class of air toxic compound. Volatile organic compounds
(VOCs), excluding carbonyls, are measured by continuous analyzers (on-line gas chromatographs) at
selected sites. The remaining sites use automated samplers to collect VOC canister samples, which are
then transported to the laboratory for analysis. Carbonyls are collected with adsorbent sampling tubes,
which are transported to the laboratory for analysis. Inorganic compounds are collected on PM2.5 filters
(as part of particulate criteria pollutant monitoring) and analyzed (after weighing for PM2.5 mass) by
inductively coupled plasma mass spectrometry (ICP MS). This monitoring is conducted as part of the Air
Toxics monitoring network.
PROGRAM UPDATE
At the beginning of 2000, the Acme Reporting Organization operated five ambient air monitoring
stations that measured air toxics. On March 1, 2000, a sixth station was opened at Townone to monitor
air toxics.
QUALITY OBJECTIVES FOR MEASUREMENT DATA
The Quality Objectives for the Acme Reporting Organization's monitoring of ambient air toxics are
shown in Table 12, below.
Table 12. Quality Objectives for Air Toxics

Data Quality Indicator   Objective
Precision                25%
Bias                     25%
Completeness             75%
Promptness               100%
DATA QUALITY ASSESSMENT
Summary
Assessment of the data quality for ambient air toxics showed that all instruments met goals for
precision, bias, completeness, and promptness. System audits showed siting problems at two sites; both
were corrected promptly.
Promptness and Completeness
At least 75 percent of scheduled monitoring data must be reported, and all data must be submitted
within six months after the end of the reporting quarter. Table 13 summarizes promptness and
completeness for ambient air toxics monitoring data; a brief calculation sketch follows the table.
Table 13. Data Quality Assessment for Promptness and Completeness

Pollutant                    Promptness   Completeness
Carbonyls                    100%         78%
Volatile organic compounds   100%         84%
Inorganic compounds          100%         87%
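The sketch below restates the two checks numerically: completeness as the share of scheduled samples
actually reported (goal: at least 75 percent), and promptness as submission within six months of the end
of the reporting quarter. The helper names and example values are hypothetical.

    # Illustrative sketch; helper names and example values are hypothetical.
    import calendar
    from datetime import date

    def completeness(reported: int, scheduled: int) -> float:
        """Percent of scheduled monitoring data actually reported."""
        return 100.0 * reported / scheduled

    def add_months(d: date, months: int) -> date:
        """Advance a date by whole months, capping the day as needed."""
        m = d.month - 1 + months
        y, m = d.year + m // 12, m % 12 + 1
        return date(y, m, min(d.day, calendar.monthrange(y, m)[1]))

    def is_prompt(submitted: date, quarter_end: date) -> bool:
        """True if data arrived within six months of the quarter's end."""
        return submitted <= add_months(quarter_end, 6)

    print(completeness(reported=307, scheduled=365))        # 84.1...
    print(is_prompt(date(2000, 9, 15), date(2000, 3, 31)))  # True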
Precision
At least once every two weeks, precision for automated VOC analysis is determined by sampling a gas of
known concentration. Precision for manual VOC sampling, carbonyl sampling, and inorganic sampling is
obtained by analysis of duplicate samples. Duplicates are taken at a frequency of one duplicate for every
10 samples. Table 14 summarizes the precision check results for 2000.
Table 14. Data Quality Assessment for Precision

Pollutant                    Precision checks   Precision checks
                             completed          within limits
Carbonyls                    91%                90%
Volatile organic compounds   98%                97%
Inorganic compounds          90%                91%
Bias
The results of the annual performance audits conducted by ARO personnel are shown in Figure 9, below.
For the automated VOC analyzers, the center line represents the average bias across all sites (i.e., with all
sites weighted equally). For the carbonyl, manual VOC, and inorganic analyses, the center line represents
the average of all audit samples for the central analytical laboratory. The lower and upper probability
limits represent the boundaries within which 95 percent of the individual bias values are expected to be
distributed. All measurements represent the average of all audit species.
Figure 10 shows the results of the external performance audits performed with the National
Performance Audit Program (NPAP), which is administered by the U.S. EPA.
System Audits
Systems audits were performed at two sites during calendar year 2000. These audits evaluated areas
such as siting criteria, analyzer and sampler operation and maintenance, operator training, and
recordkeeping, and served as a general review of site operations. No significant problems were found,
except for the following:
The Townsix site had vegetation which had grown too close to the analyzer inlet probes. The
vegetation was removed within one week, with assistance from the County Parks and
Recreation Department.
A systems audit was also performed at the central analytical laboratory. No significant problems
were found.
Example of Corrective Action Form
A corrective action request should be made whenever anyone in the reporting organization notes
a problem that demands either immediate or long-term action to correct a safety defect, an
operational problem, or a failure to comply with procedures. A typical corrective action request
form, with example information entered, is shown below. A separate form should be used for
each problem identified.
The corrective action report form is designed as a closed-loop system. First, it identifies the
originator (the person who reports the problem), states the problem, and may suggest a solution.
The form then directs the request to a specific person (or persons), i.e., the recipient, who is best
qualified to "fix" the problem. Finally, the form closes the loop by requiring that the recipient
state how the problem was resolved and how effective the solution was. The form is signed, a
copy is returned to the originator, and other copies are sent to the supervisor and the applicable
files for the record.
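As a way of seeing the closed loop in one place, the sketch below models the request as a record
with a Part A filled in by the originator and a Part B that must be completed before the request
counts as closed. The field names mirror the ARO form shown next but are our own naming choices,
not a prescribed schema.

    # Sketch of the closed-loop record; field names are our assumptions.
    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class CorrectiveActionRequest:
        # Part A - completed by the originator
        originator: str
        recipient: str            # person best qualified to fix the problem
        urgency: str              # e.g., "Routine (7 days)"
        site: str
        system: str
        date_identified: date
        problem: str
        recommended_action: Optional[str] = None
        # Part B - completed by the recipient; closes the loop
        date_corrected: Optional[date] = None
        action_taken: Optional[str] = None
        effectiveness: Optional[str] = None

        def is_closed(self) -> bool:
            """The loop closes once the recipient documents the resolution."""
            return None not in (self.date_corrected, self.action_taken,
                                self.effectiveness)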
ARO - Corrective Action Request
Part A - To be completed by requestor
To: John S. Visor
Organization Responsible for Action ARO Ambient Air Monitoring Section
Urgency:
Emergency (failure to take action immediately may result in injury or property damage)
Immediate (4 hours) Urgent (24 hours) Routine (7 days)
As resources allow For Information only
From: William Operator phone: (000) 555 - 1000
fax: (000) 555 - 1001 e-mail: billo@localhost
Copies to:
(Always send a copy to the ARO Site Coordinator at 115 Generic Office Building, Townone XX, 00001)
Problem Identification
Site(Location): Townsix site
System: sample inlet
Date problem identified: Aug. 1, 2000
Nature of problem: Glass sample inlet and dropout trap broken during removal
of weeds from site
Recommended Action: Replace broken parts
Signature: William Operator Date: Aug. 1, 2000
Part B - To be completed by responsible organization
Problem Resolution
Date corrective action taken: August 4, 2000
Summary of Corrective Action: Replacement parts were ordered and received. The new
parts were installed within three days of the request. Data from the days with a cracked sample inlet will
be flagged as questionable.
Effectiveness of corrective action: Sample inlet restored to new condition.
Signature: John Visor Date: Aug. 4, 2000
Phone: (000) 555 - 2000 Fax: (000) 555 - 2001
e-mail: jsv@localhost
Send copies of the completed form to the requestor and the ARO Site Coordinator at 115 Generic Office Building, Townone
XX, 00001
ARO form CAR-1, May 1, 1999
United States Environmental Protection Agency
Office of Air Quality Planning and Standards
Air Quality Assessment Division
Research Triangle Park, NC
Publication No. EPA-454/B-13-003
May 2013