
[Project name][Section Number], [Revision Number]

[Date of version] Page # of ##

A1. Title Page and Approval Page

[OPTIONAL - INCLUDE TRIBE LOGO HERE]

[AIR, LAND, WATER]

TRIBAL ENVIRONMENTAL PROGRAM-LEVEL

QUALITY ASSURANCE PROJECT PLAN

Prepared by [Insert Name of organization conducting work and mailing address]

Prepared for

Date (Month, Day and Year)

Draft 8-5-13


A1. (continued) Title Page and Approval Page

Name: [Tribe’s Chairperson/Chief]
Title: Tribal Council _______________________________
Organization: _______________________________
Telephone: _______________________________
Email: _______________________________
Signature: _______________________________ Date: __________

Name: [Tribe’s Project Manager]
Title: Project Manager
Organization: _______________________________
Telephone: _______________________________
Email: _______________________________
Signature: _______________________________ Date: __________

Name: [Tribe’s Quality Assurance Manager]
Title: QA Manager
Organization: _______________________________
Telephone: _______________________________
Email: _______________________________
Signature: _______________________________ Date: __________

Name: [Preparer of Document other than Tribe]
Title: _______________________________
Organization: _______________________________
Telephone: _______________________________
Email: _______________________________
Signature: _______________________________ Date: __________

Name: _______________________________
Title: Project Manager (if Designated Approving Official)
Organization: USEPA
Telephone: _______________________________
Email: _______________________________
Signature: _______________________________ Date: __________

Name: _______________________________
Title: Quality Assurance Manager or Designated Approving Official
Organization: USEPA
Telephone: _______________________________
Email: _______________________________
Signature: _______________________________ Date: __________


A2. Table of Contents

A. PROJECT MANAGEMENT

A1. TITLE AND APPROVAL
A2. TABLE OF CONTENTS
A3. DISTRIBUTION LIST
A4. PROJECT/TASK ORGANIZATION
A5. PROBLEM DEFINITION/BACKGROUND
A6. PROJECT/TASK DESCRIPTION
A7. SPECIAL TRAINING AND CERTIFICATION
A8. DOCUMENTATION AND RECORDS

B. DATA GENERATION AND ACQUISITION

B1. SAMPLING DESIGN AND SITE FIGURES
B2. SAMPLING AND ANALYTICAL METHODS REQUIREMENTS
B3. SAMPLE HANDLING AND CUSTODY REQUIREMENTS
B4. ANALYTICAL METHODS REQUIREMENTS
B5. FIELD QUALITY CONTROL REQUIREMENTS
B6. LABORATORY QUALITY CONTROL REQUIREMENTS
B7. FIELD EQUIPMENT CALIBRATION AND CORRECTIVE ACTION
B8. LABORATORY EQUIPMENT CALIBRATION AND CORRECTIVE ACTION
B9. ANALYTICAL SENSITIVITY AND PROJECT CRITERIA
B10. DATA MANAGEMENT AND DOCUMENTATION

C. ASSESSMENT/OVERSIGHT

C1. ASSESSMENT AND RESPONSE ACTIONS
C2. REPORTS

D. DATA VALIDATION AND USABILITY

D1. DATA REVIEW, VALIDATION AND VERIFICATION

A3. Distribution List

This list includes the names and addresses of those who receive copies of this approved Tribal Program-Level QAPP and subsequent revisions.

Name: ________________________
Title: Quality Assurance Manager, Tribe
Address: _______________________________
_______________________________

Name: ________________________
Title: Project Manager, Tribe
Address: _______________________________
_______________________________


Name: ________________________
Title: Project Manager, USEPA
Address: _______________________________
_______________________________

Name: ________________________
Title: Quality Assurance Manager or Designated Approving Official, USEPA
Address: _______________________________
_______________________________

Name: ________________________
Title: Field Leader
Address: _______________________________
_______________________________

Name: ________________________
Title: Laboratory Manager/Leader
Address: _______________________________
_______________________________

A4. Project/Task Organization

The individuals or organizations participating in the project and their specific roles and responsibilities are provided below. A project organization chart is provided following the specific roles and responsibilities. Also identify the individuals who have stop-work authority if the QAPP is not being followed properly and/or if health and safety issues warrant work stoppage. Following is a list of the key project personnel and their corresponding responsibilities.

[You can insert an organizational chart if it is helpful to you]

Local Community Elder(s) Responsibilities
[List the names and titles of the local tribal elders who will be involved in making decisions regarding this project.] The main responsibilities of the local community elders may include providing historical and cultural information. Elders will assist the tribe in making decisions that will affect the tribe’s cultural heritage.

Project Manager (PM) Responsibilities
[Name of the person who will serve as the Tribe’s Project Manager.] The Project Manager is the primary contact for technical objectives, sampling, analytical procedures, QA requirements, problem resolution and general implementation of the QAPP. The Project Manager oversees project efforts and other project activities and ensures that each of the project team members conducting project activities has completed all required training and refresher requirements.

Project Quality Assurance Manager Responsibilities
[Name of the person who will be the Tribe’s QA Manager.] The QA Manager prepares the project QAPP and its subsequent revisions. The QA Manager ensures that the QAPP incorporates adequate QA and QC measures to meet the data quality objectives set forth by the project and the program, and that the QAPPs are reviewed and approved in a timely manner by the appropriate approving personnel. The QA Manager also ensures that the QA/QC measures specified in the QAPP are effectively implemented throughout the duration of the project. The QA Manager coordinates and facilitates technical, performance and quality system audits conducted by appropriate authorities at the project-specified frequency.

Field/Sampling Leader (FSL) Responsibilities
[List the name of the field or sampling leader(s).] The Field/Sampling Leader is responsible for the timely completion of assigned fieldwork with strict adherence to the QAPP’s activity/task schedules, SOPs and sample chain-of-custody documentation. The Field/Sampling Leader will perform the following duties:

1. Select the field team.

2. Conduct the field activities per the approved QAPP and supervise the field sampling team.

3. Upon receipt from the Project Manager, distribute the approved QAPP and subsequent revisions to the members of the field sampling team.

4. Report problems in the field to the Project Manager.

5. Implement corrective actions in the field as directed by the Project Manager. Corrective actions will be documented in the field logs and provided to the Project Manager.

Laboratory Manager/Leader Responsibilities
[Specify the name of the laboratory that will perform the analyses for this project. Specify the name, phone number and/or e-mail address of the contact for the laboratory.] This individual will be responsible for coordinating the analysis of the samples and laboratory verification of the data. He/she will coordinate the receipt of the samples at the laboratory, select the analytical team, ensure internal laboratory audits are completed per the laboratory’s Quality Assurance Manual, and distribute the applicable sections of the QAPP and subsequent revisions to members of the analytical team. He/she is responsible for instituting corrective actions for problems encountered in the chemical analyses and will also report laboratory problems affecting the project data to the Project Manager and QA/QC Manager. Corrective actions for chemical analyses will be detailed in a QA report that will be provided to the Project Manager via electronic and conventional mail.

EPA Region 4 Project Officer
[Specify name.] The EPA Project Officer has the responsibility to oversee and monitor the grant. As part of that responsibility, he/she must ensure the process described in the work plan is followed and the terms and conditions of the grant are met.

EPA Region 4 Quality Assurance Coordinator or Designated Approving Official
[Specify name.] The Region 4 Quality Assurance Coordinator or Designated Approving Official (DAO) provides technical assistance to the Region 4 Project Officer working on projects/sites. The DAO’s role is to provide technical reviews of the Program-Level QAPPs and Project/Site-Specific QAPPs that are generated. This includes the approval of the Program-Level QAPP and Project/Site-Specific QAPPs.

A5. Problem Definition/Background

The problem definition will be addressed in the Project/Site-Specific QAPP.

A6. Project/Task Description

The project/task description and timeline will be provided in a Project/Site-Specific QAPP.

The project/task description will address items such as the project objectives and goals; the tasks and activities that will be conducted to achieve the project goals; field work that will be conducted; rationale for sample locations; environmental samples that will be collected as appropriate for soil, surface water, subsurface soils, groundwater, waste matrices, etc.; sample management and handling; laboratory analysis; data review and evaluation; and reporting.

A timeline will address the overall projected schedules or timelines for conducting the tasks.

A7. Special Training and Certification

Training needs are determined by the requirements to conduct air, land and/or water projects and by a review and evaluation of specific site conditions and proposed activities. In general, all field personnel must have current 40-hour Hazardous Waste Operations and Emergency Response (HAZWOPER) certification and up-to-date 8-hour refreshers, as applicable, in accordance with 40 CFR Part 311 and 29 CFR 1910.120. All field team members will receive orientation as to the content and importance of this QAPP and each Project/Site-Specific QAPP, including items such as SOPs, the work plan, and the Health and Safety Plan (HASP). If applicable, a review of state and federal license or certification requirements to perform certain duties will be conducted for each project and stated in the Project/Site-Specific QAPP.

A8. Documentation and Records

Document generation and control is accomplished by well-defined filing procedures using a well-organized filing system. All project documents will be filed following the procedures of the [Name the system] standardized project filing system. Original documents will be held by [Name of entity that will hold the files] and will be located in the [Name the location] office. Documents will be maintained for a period of [indicate years to be held – check with EPA Project Officer] years. The following is an example, but not an inclusive list, of documents that may be generated and located in the project files:

• A study plan;
• Original chain-of-custody forms and field logbooks;
• All records obtained during the investigation;
• Boring logs, monitor installation forms, monitor sampling forms, etc.;
• A complete copy of all analytical data and transmittal documents;
• Progress and/or status reports;
• Data validation/data quality assessment reports;
• Any other relevant documentation including photographs, letters, memos, and video and audio media;
• Copies of complete assessment reports; and
• Project audit and QA reports.

Laboratory analysis will be performed within the laboratory’s standard turn-around time, with electronic data and hard copies received no later than 35 calendar days. Both the electronic data report and hard copy will include the following:

• Job description and number;
• Field sample and lab sample identification;
• Analytical method, method detection limit, and reporting limit;
• Sample matrix;
• Sample collection date and analysis date;
• Analyst identification;
• Analyte name and result with units;
• Dilution factors;
• Quality control results; and
• Chain-of-custody records (including sample preparation and analysis date and time for each sample, analytical method numbers, extraction and digestion method numbers used for each sample, etc.).

B. DATA GENERATION AND ACQUISITION

[Methods described in the Group B elements should have been summarized earlier in element A6. Designated methods should be both well documented and readily available or attached to this QAPP.]

B1. Sampling Design and Site Figures

The sampling design process and site figures will be provided in the Project/Site-Specific QAPP. The network design components should comply with the recommendations in 40 CFR Part 58, Appendix D.

B2. Sampling and Analytical Methods Requirements

All samples will be collected in a manner consistent with the media being sampled and the analytes of interest. The collection methods will follow those outlined in the most recent version of the EPA Region 4 SESD, Field Branches Quality System and Technical Procedures.

Non‐typical sampling SOPs will be described in the Project/Site-Specific QAPP.

A table of the sampling equipment used on a particular project will be provided in the Project/Site-Specific QAPP.

B3. Sample Handling and Custody Requirements

Field and laboratory personnel will be aware, at all times, of the need to properly maintain all samples, whether in the field or in the laboratory, under strict chain-of-custody protocols and in a manner that retains physical sample properties and chemical composition. The handling and transportation of samples will be accomplished in a manner that not only protects the integrity of the sample, but also documents sample custody. Packing, marking, labeling, and shipping of samples will follow those outlined in the EPA Region 4 SESD, Field Branches Quality System and Technical Procedures.

B4. Analytical Methods Requirements

All analytical methods used on samples will comply with relevant requirements of applicable federal programs (e.g., Clean Water Act [CWA], Safe Drinking Water Act [SDWA], Resource Conservation and Recovery Act [RCRA], Comprehensive Environmental Response, Compensation, and Liability Act [CERCLA], Clean Air Act [CAA]) for which they were collected, or alternative EPA‐approved methods. The list of approved analytical methods is subject to routine updates; therefore, the most recently approved methods will be verified.

Non‐standard or unpublished methodologies for analysis are not anticipated. Should unexpected requirements for non‐standard or unpublished analytical methodologies arise, they will be discussed in the Project/Site-Specific QAPP along with the reasoning behind the method use and the validation criteria required.

The Tribe’s QA Manager will be responsible for overseeing the success of the analysis and for implementing corrective actions (if warranted).

Laboratory analysis will be performed within the laboratory’s standard turn-around time, with electronic data and hard copies received no later than 35 calendar days.
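As an illustrative aid only (not part of the template language), the 35-calendar-day receipt requirement above could be tracked programmatically. A minimal Python sketch, assuming dates are recorded as ISO-format strings and measuring from the sample collection date (the template does not specify the baseline date, so that is an assumption here):

```python
from datetime import date

TURNAROUND_LIMIT_DAYS = 35  # per this QAPP template

def within_turnaround(sample_date_iso, report_date_iso):
    """True if the lab deliverable was received within the
    35-calendar-day turnaround window."""
    sampled = date.fromisoformat(sample_date_iso)
    reported = date.fromisoformat(report_date_iso)
    return (reported - sampled).days <= TURNAROUND_LIMIT_DAYS

print(within_turnaround("2013-08-05", "2013-09-05"))  # 31 days -> True
print(within_turnaround("2013-08-05", "2013-09-15"))  # 41 days -> False
```

A check like this could be run against the EDD receipt log to flag late deliverables for follow-up with the laboratory.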

B5. Field Quality Control Requirements

This section defines the quality control requirements for field sampling activities, including QC sample selection and data quality objectives (DQOs). Quality control in the field will be conducted in accordance with the applicable SOPs and will follow those outlined in the EPA Region 4 SESD, Field Branches Quality System and Technical Procedures.

Data quality will be assessed using QC samples, which will be selected for each project based on the project DQOs, project sampling procedures, and established analytical method requirements. QC samples will be collected to verify the validity of analytical results and to assess whether the samples were contaminated from sources not directly attributable to releases at the site (such as improper decontamination, cross-contamination, etc.). Field QC samples will include trip blanks, field blanks, equipment blanks/rinsate samples, and field duplicates, as appropriate. The field QC samples proposed for collection will be identified in the Project/Site-Specific QAPP addendum.


Field Duplicate Samples: A field duplicate is a second sample collected at the same location as the original sample and will be used to assess sampling and laboratory precision. Duplicate samples will be collected simultaneously or in immediate succession, following identical collection procedures, and treated in the same manner during sample shipment, storage and analysis. The sample containers will be assigned an identification number in the field such that they cannot be identified (blind duplicate) as duplicate samples by laboratory personnel. Field duplicate samples will be collected at a 10 percent frequency.
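For illustration only, the precision assessed with blind field duplicates is commonly summarized as a relative percent difference (RPD) between the original and duplicate results. The RPD formula below is standard QA practice but is not specified in this template; acceptance limits belong in the Project/Site-Specific QAPP. A minimal Python sketch:

```python
def relative_percent_difference(original, duplicate):
    """RPD between an original result and its field duplicate:
    |x1 - x2| / mean(x1, x2) * 100."""
    mean = (original + duplicate) / 2.0
    if mean == 0:
        return 0.0  # both results zero (or cancel out): no difference to report
    return abs(original - duplicate) / mean * 100.0

# Example: original 12.0 mg/L, duplicate 10.0 mg/L
rpd = relative_percent_difference(12.0, 10.0)
print(round(rpd, 1))  # 18.2
```

Results exceeding the project's RPD acceptance limit would trigger the corrective-action steps listed later in this section.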

Equipment Rinsate Samples: The equipment rinsate blank is a sample of organic-free (deionized) water that is prepared in the laboratory, shipped to the site with other sample containers, and poured over the cleaned, decontaminated sample collection equipment between sample collections. The equipment rinsate blank will be used to evaluate potential cross-contamination that may occur by reusing sample collection equipment that has not been thoroughly decontaminated between sample collection events.

Trip Blank: Trip blanks are supplied by the designated laboratory and consist of deionized water in a 40-ml vial. The trip blank will remain in the sample ice chest along with the investigation samples and will be analyzed for target volatile compounds only.

Temperature Blank Samples: Temperature blank samples will also be supplied by the laboratory and will accompany each ice chest. The laboratory will use the temperature blank samples to measure the temperature within the cooler upon arrival at the laboratory.

QC Samples: A minimum of one set of precision QC samples for each medium (groundwater, surface water, soil/sediment, air) will be collected per site. Where both soil and sediment are sampled, field personnel will collect the replicate split of whichever medium is sampled most at a given project. All QC samples will be documented in the sampling report.

Data quality indicators (DQIs) will be used to evaluate the quality of the data including precision, bias, accuracy, completeness, representativeness, comparability, and sensitivity.
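As a hypothetical illustration of one such indicator, completeness is typically computed as the percentage of planned measurements that produced valid, usable results. The function below is an example sketch, not template language; the acceptance criterion would be set in the Project/Site-Specific QAPP:

```python
def completeness(valid_results, planned_results):
    """Completeness DQI: percent of planned measurements
    that produced valid, unrejected results."""
    if planned_results == 0:
        raise ValueError("planned_results must be greater than zero")
    return 100.0 * valid_results / planned_results

# 47 valid results out of 50 planned samples
print(completeness(47, 50))  # 94.0
```

The other indicators (precision, bias, accuracy, representativeness, comparability, sensitivity) are evaluated against the method- and project-specific criteria referenced elsewhere in this QAPP.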

The following actions will be taken when control limits are exceeded, interferences or dilution problems are encountered, or equipment sensitivity is too low:

1) Review data outliers with the lab;
2) Determine if re-analysis or re-sampling is required;
3) Flag data in the report and explain; and
4) Indicate whether data can be used (as an indicator), relied upon, or must be rejected.

B6. Laboratory Quality Control Requirements

[Name of laboratory] has a QC program in place to ensure the reliability and validity of the analyses performed at the laboratory. All analytical methods are documented in laboratory SOPs. Each SOP includes a QC section, which addresses the minimum requirements for the procedure. These SOPs will be presented upon request. Laboratory SOPs will be included in an appendix to the Project/Site-Specific QAPP.

B7. Field Equipment Calibration and Corrective Action

The calibration and maintenance of field and laboratory instrumentation is an important aspect of the project’s overall QA/QC program. All field and laboratory instrumentation will be calibrated prior to use and continually thereafter as necessary. Initial and continuing calibration of field and fixed laboratory instruments will be performed in accordance with the method-specific requirements. All initial and continuing calibration procedures will be implemented by trained personnel following the manufacturer's instructions and EPA specifications to ensure the equipment is functioning within the tolerances established by the manufacturer and the EPA method-specific analytical requirements.

All documentation pertinent to the calibrations and/or maintenance of field equipment will be maintained in a dedicated, active field logbook and the Calibration and/or Field Instrument Maintenance Record. Entries made into the forms regarding the status of any field equipment will contain, but are not necessarily limited to, the following information:

• Name and unique identification number (e.g., serial number) of instrument being calibrated;

• Date and time of calibration;
• Name of person conducting calibration;
• Reference standard name and concentration used for calibration (calibration standards used along with concentrations);
• Calibration procedure followed, including expiration date of calibration standards;
• Continuing calibration check and procedures followed;
• Description of maintenance or repair (if applicable);
• Date and time that instrument was taken out of service and returned to service (if applicable); and
• Other pertinent information.

Equipment that fails calibration and/or becomes otherwise inoperable during the field investigation will be removed from service and segregated to prevent inadvertent use. Such equipment will be properly tagged to indicate that it should not be used until the nature of the problem can be ascertained. The Tribe’s Project Manager will approve any equipment requiring repair or re‐calibration before such equipment is returned to service. Equipment that cannot be repaired or recalibrated will be replaced.

B8. Laboratory Equipment Calibration and Corrective Action

Laboratory services will be provided by [Name of laboratory]. All laboratory equipment and instruments will be tested, inspected, calibrated, and maintained in accordance with method-specific requirements or the manufacturer’s specifications/recommendations. SOPs for laboratory equipment/instruments, analysis, and corrective actions are included in Appendix ___. The laboratory equipment/instruments, supplies, methods, and procedures for sample handling, analysis, and reporting will be in accordance with the requirements of the EPA methodologies. Initial and continuing calibration checks will be identified in [Name of laboratory]’s QAM. Furthermore, actions taken by [Name of laboratory] with regard to corrective actions will be addressed in [Name of laboratory]’s Quality Assurance Manual.


B9. Analytical Sensitivity and Project Criteria

The analytical sensitivity, project-specific detection limits, and QA/QC requirements and criteria will be provided in a Project/Site-Specific QAPP.

B10. Data Management and Documentation

Data management in general, including field records, contaminants of concern review and correction, data review, reduction and transfer to data management systems, quality control charts, quality control procedures, and sample receipt, storage and disposal, will be performed in accordance with accepted industry practices.

Documentation will be in accordance with applicable SOPs and accepted industry practices and will include the field logbooks and data sheets, sampling reports, copy of the COC, and field QA controls with analytical results. All project documentation generated will be maintained in the project file.

Data reporting included in the final report will include, at a minimum:

• Sample documentation (i.e., location, date, and time of collection and analysis, etc.);
• COC documentation;
• Initial and subsequent equipment/instrument calibration;
• Determination and documentation of detection limits;
• Analyte(s) identification;
• Analyte(s) quantitation;
• QC sample results; and
• Duplicate results.

Precautions will be taken during the reduction, manipulation, and storage of data to prevent the introduction of errors or the loss or misinterpretation of data.

All field sample documents will be legibly written in ink. Any corrections or revisions to sample documentation will be made by lining through the original entry and initialing and dating any changes. If difficulties are encountered during sample collection or sample analysis, a brief description of the problem will be provided in the sampling report submitted by the Tribe.

A project‐specific field log book will be maintained to detail site activities and observations so that an accurate and factual account of field/site conditions and field procedures may be reconstructed. All field logbook entries will document (at a minimum) the following:

• Site name and project number;
• Names of personnel on site;
• Dates and times of all entries;
• Descriptions of all site activities, including site entry and exit times;
• Noteworthy events and discussions;
• Weather conditions;
• Site observations;
• Identification and description of samples and locations;
• Contractor (if applicable) information and names of on-site personnel;
• Dates and times of sample collections and COC information;
• Records of photographs;
• Site sketches; and
• All relevant and appropriate information pertaining to sample shipment.

All field documents will be reviewed weekly by the Tribe’s QA Office/Manager for completeness and accuracy. Copies of the boring logs and sample collection logs will be included in the final report.

[Name of laboratory] will be responsible for providing electronic data deliverables (EDDs) and hardcopy analytical reports. The hardcopy data deliverable will include final results, analytical methods, detection limits, surrogate recovery data, method blanks, and QC sample results. A narrative discussion of special analytical problems and/or modifications to the standard analytical method will be included. Data will be reported in units commonly used for the analysis performed. The number of significant figures reported will be consistent with the limits of uncertainty in the analytical method performed. Typical laboratory data reports will include the following:

• Project narrative, including an explanation of any qualified data and observations or deviations encountered during the analysis;
• Data results sheets (including collection and laboratory receipt dates, preparation and analysis dates, sample concentrations, units, reporting limits, and standard QC results);
• Initial and continuing calibration results and acceptance limits;
• All other method QC data; and
• All sample preparation and analytical raw data.

The raw data in the form of instrument printouts (chromatograms, count reports, etc.) will be included in the hardcopy data package. Electronic summary spreadsheets containing raw data will also be provided. These summary tables will be compared to original laboratory data by the laboratory QA/QC reviewer prior to incorporation into reports.

QA/QC reviews will confirm the following objectives:

• Adherence to the approved sampling plan;
• Equipment and equipment/instrumentation validation;
• Adherence to proper sample collection techniques;
• Comparison of data with QC criteria;
• Review of COC documentation;
• Review of sample shipment; and
• Review of handling procedures, including verification that the holding time, preservation, container and volume requirements were met.

The laboratory QA/QC reviewer may identify the need for corrective action during either the data review/validation or data assessment process. Corrective action may include re-sampling by the field personnel (e.g., missed holding times or samples analyzed for incorrect analytes) or re-extraction/re-analysis of samples (e.g., surrogate recoveries outside criteria or samples requiring dilution for analyte quantitation) by the laboratory. These measures are dependent upon the ability to mobilize the field personnel and whether the data to be collected are necessary to meet the required project QA objectives (e.g., holding time was not exceeded for samples to be re-analyzed, etc.). If the Tribe’s QA Manager identifies a corrective action situation, the Tribe’s Project Manager will be responsible for approval of corrective actions and for ensuring the implementation of the corrective measure.


Any corrective action that requires re-sampling or changes to the work plan or Project/Site-Specific QAPP will be defined as a major corrective action. Major corrective actions include, but are not limited to, measures that change the number of samples to be collected, alter previously selected sampling locations, or impact the project quality control objectives or site decisions. The Tribe’s Project Manager will be responsible for contacting the EPA Project Officer/Manager to discuss all major corrective actions. Major corrective actions will be approved by the EPA Project Officer/Manager before implementation by the Tribe’s Project Manager.

Electronic copies of the laboratory reports, spreadsheets, and databases will be maintained within the appropriate folder on the Tribe’s computer server. Software typically used includes Microsoft Word, Excel, or Access, or SEDDs. Hard copies of laboratory reports and data tables will be maintained in the project file and will be included in the final report.

The final project file will be the central repository for all documents that constitute information or data relevant to sampling and analysis activities associated with the project. The Tribe is the custodian of the project file and maintains all relevant documents in a secured, limited-access area under the custody of the Tribe’s Project Manager. The final project file will be maintained until [___ years] following termination of the project, or as required by the specified program (air, water, etc.). The final project file will include, at a minimum:

• Field logbooks;
• Field data and data deliverables;
• Photographs;
• Drawings;
• Laboratory data deliverables;
• Data validation reports;
• Data assessment reports;
• Progress reports, QA reports, interim project reports, etc.; and
• All custody documentation (e.g., COCs, tags, forms, etc.).

C. Assessment and Oversight

C1. Assessment and Response Actions

Project Level Assessments (Internal assessments of project)

• Depending on the frequency of the sample collection and measurement activities, split samples shall be collected and analyzed by a laboratory. Split confirmatory samples shall also be sent to the laboratory in case questionable/anomalous high results are obtained in the field. Laboratories will be selected based on consideration of the following factors:

• Three levels of data verification shall be employed: during sample collection, during data documentation, and during data entry and generation. Data that do not meet the project DQOs will be appropriately flagged or qualified in the database during the data validation process.

• The Tribe’s Project Manager will review this QAPP and the overall project design annually and may suggest procedural refinements or additional testing procedures. This may include new parameters to be measured or changes to procedures currently in use. Any such changes will be subject to EPA and [Name of entity] approval. The project is open to EPA or [Name of entity] system audits at their discretion.

• An internal QA assessment will be conducted by the Tribe’s QA Manager to assess the progress and effectiveness of the project annually.

Program Level Assessments (External assessments of program)

• A review of the project will be conducted by EPA annually, or as requested by the specific program (air, water, etc.), to assess the progress and effectiveness of the project.

Response and Corrective Actions

Problems encountered during sample collection and data generation shall be addressed as soon as possible. No measurements will be generated by an instrument or piece of equipment that does not meet the technical specifications of the manufacturer or the method SOP.

Problems that may have a significant impact on data quality shall be properly documented, and the resulting data will be qualified/flagged accordingly. A list of the data qualifier flags and their definitions must be included in the project or site-specific QAPP.

Any failure to meet data quality objectives will be evaluated. If the cause is found to be equipment failure, calibration and maintenance procedures will be reassessed and improved. If the problem is found to be personnel error, personnel will work with the Project QA Officer to resolve the problem. If accuracy and precision goals are frequently not being met, QC sessions will be scheduled more often.

If failure to meet program specifications is found to be unrelated to equipment, methods, or personnel error, the QAPP may be revised. Revisions and subsequent modifications and amendments to this QAPP shall be submitted to the EPA Project Officer/Manager for review and approval.

C2. Reports

Annual reports will be produced and submitted on [fill in blank] of each year and will describe activities during the previous calendar year. These reports will consist of data results, interpretation of data, information on project status, highlights, and the results of QC audits and internal assessments.

The project personnel are responsible for report production and distribution. Annual reports will be forwarded to the regional office of EPA as specified by the EPA Project Officer/Manager.

D. Data Validation and Usability

D1. Data Review, Validation and Verification

All data review, verification, validation, and assessment SOPs are included in the Program Level QAPP and the Project/Site-Specific QAPP.

Field Data Validation

The Tribe’s Project Manager will validate the field data and discuss any problems identified during the project with the Tribe’s Field Team Leader. Data will be reviewed for integrity by checking all field entries for errors and consistency. Data validation will be accomplished through a series of checks and reviews intended to assure that the reported results are of verifiable, reproducible, and acceptable quality.

The Tribe’s Project Manager will perform a data usability review that includes an assessment of field procedures (including field notes, field screening results, and field analytical data) and of the completeness, comparability, representativeness, precision, and bias (accuracy) of the data. The findings of this review will be documented and presented in the final report. (Include a Data Quality Objective (DQO) table or worksheet.)
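As one illustration of the completeness indicator mentioned above, completeness is commonly computed as the percentage of planned measurements that yielded valid, usable results. The function and example below are a hypothetical sketch, not a requirement of this QAPP; the actual completeness goal belongs in the Project/Site-Specific QAPP.

```python
def completeness(usable_results, planned_results):
    """Completeness: the percentage of planned measurements that
    produced valid, usable results.

    Both arguments are counts; the goal against which the result is
    compared (often 90-95%) is project-specific and not fixed here.
    """
    if planned_results <= 0:
        raise ValueError("planned_results must be greater than zero")
    return 100.0 * usable_results / planned_results

# Example: 19 usable results out of 20 planned samples.
print(completeness(19, 20))  # prints 95.0
```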

If verification or validation indicates that samples have been collected and/or analyzed out of compliance with the QAPP (for instance, deviations from the acceptance criteria for quality control defined in this QAPP and its addendums), re-sampling may be required. The Tribe’s Project Manager must contact the EPA Project Officer in the event of any deviations from the QAPP, and the EPA Project Officer will determine whether the data are acceptable or resampling is required. If data that deviate from the QAPP are accepted, the data will be used for screening purposes only and annotated as such. (Note: The limitations of the data must be described in the final report; if the data are used for screening purposes only, site decisions will be affected.)

Laboratory Data Evaluation

The Laboratory Director/QA Manager will review and verify the laboratory data generated under their corrective action system for accuracy according to the laboratory’s QMP. Quality control checks are performed on field data by reviewing the chain-of-custody forms and the results from the laboratory for each sampling event. All sample results will be reviewed and correlated to field measurements and observations. The validation process will include:

• Narrative review;
• Quality control blanks meet criteria;
• Quality control data (spikes, duplicates) are acceptable;
• Surrogate spike recoveries are acceptable;
• Unacceptable data are identified and corrective actions are initiated; and
• Data qualifiers are assigned (by the lab) if necessary.

(Note: Data should be verified by laboratory staff/management. Data validation should be performed by an organization external to the laboratory. The final set of data qualifier flags should be assigned by independent data validators.) [Name of laboratory] QAM Section ___, section title.

In addition to evaluating the data qualifiers associated with laboratory analyses, the sample duplicate(s) will be compared with the corresponding sample result(s) to evaluate the reproducibility of the sample results based on the laboratory analysis and the sample collection and transportation procedures. For this comparison:

• If the duplicate or sample result is less than five times (5X) the reporting limit, the comparison is made on the absolute difference between the results (S-D). For water samples, precision is considered “acceptable” if this difference is less than the magnitude of the (higher) reporting limit; for soil samples, if the difference is less than twice the magnitude of the (higher) reporting limit. Differences within two times (2X) the “acceptable” limits are considered “slightly high”; anything beyond that is considered “high”.

• If both the sample and duplicate results are greater than five times (5X) the reporting limit (the higher of the two RLs, if they are not the same), precision is assessed by the relative percent difference (%RPD): the difference between the results divided by the average of the two results, multiplied by 100. An RPD below 35% is “good/acceptable”; between 35% and 50%, variability is “slightly high”; above 50%, it is “high”.
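These decision rules can be sketched in Python; the function name and interface below are illustrative, not part of the QAPP, while the thresholds mirror the criteria above:

```python
def assess_precision(sample, duplicate, rl, matrix="water"):
    """Classify sample/duplicate agreement per the criteria above.

    sample, duplicate -- the two results, in the same units as rl
    rl -- the higher of the two reporting limits
    matrix -- "water" or "soil"
    Returns "acceptable", "slightly high", or "high".
    """
    diff = abs(sample - duplicate)
    if sample > 5 * rl and duplicate > 5 * rl:
        # Both results exceed 5X the reporting limit: assess by %RPD.
        rpd = 100.0 * diff / ((sample + duplicate) / 2.0)
        if rpd < 35:
            return "acceptable"
        if rpd <= 50:
            return "slightly high"
        return "high"
    # Otherwise, assess by the absolute difference (S-D) against the RL:
    # one RL for water, two RLs for soil.
    limit = rl if matrix == "water" else 2 * rl
    if diff < limit:
        return "acceptable"
    if diff < 2 * limit:
        return "slightly high"
    return "high"
```

For example, a water sample/duplicate pair of 10 and 16 against a reporting limit of 5 falls in the low-level branch and is classified “slightly high” (the difference of 6 exceeds the RL but not twice the RL).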

Based on the data qualifiers provided by the laboratory and on the sample/sample duplicate comparison described above, data will be categorized as fully quantified, qualified, or unusable. Unusable data will not be utilized in the project decision process. Raw data will be included in all submitted project reports.

Information for this section to be provided by EPA Region 4 SESD in the near future.

The data usability review will compare proposed sample locations to actual sample locations. The review will also verify that the predefined number of samples was analyzed and confirm that the predefined analytical methods and detection limits were used. The Project Manager will review the quality control samples, hold times, calibration, surrogate recovery, and the precision and accuracy data for the sampled analytes of concern to determine whether the data will be accepted or rejected. In the event results are rejected, the Tribe’s Quality Assurance Manager, the Tribe’s Project Manager, and EPA’s Project Officer/Manager will meet to discuss the reasons for the rejection of the data and what steps should be initiated, including additional site sampling if deemed necessary. Problems associated with the laboratory’s analysis of site samples will be documented in the laboratory QA report provided with all analytical results, which will be provided to all end users in the form of summary reports. This report is generated by the Tribe’s Project Manager and includes the following validation activities:

[Figure or Table __] Validation Activities

Item: Activity

Data Deliverables and QAPP: Ensure that all required information on sampling and analysis was provided (including planning documents).

Analytes: Ensure that required lists of analytes were reported as specified.

Chain-of-Custody: Examine the traceability of the data from the time of sample collection until reporting of data. Examine chain-of-custody records against contract, method, or procedural requirements.

Holding Time: Identify holding time criteria and confirm whether they were met, or document any deviations. Ensure that samples were analyzed within the holding times specified in method, procedure, or contract requirements. If holding times were not met, confirm that deviations were documented, that appropriate notifications were made (consistent with procedural requirements), and that approval to proceed was received prior to analysis.

Sample Handling: Ensure that required sample handling, receipt, and storage procedures were followed, and that any deviations were documented.

Sampling Methods and Procedures: Establish that required sampling methods were used and that any deviations were noted. Ensure that the sampling procedures and field measurements met performance criteria and that any deviations were documented.

Analytical Methods and Procedures: Establish that required analytical methods were used and that any deviations were noted. Ensure that the QC samples met performance criteria and that any deviations were documented.

Data Qualifiers: Determine that the laboratory data qualifiers were defined and applied as specified in methods, procedures, or contracts.

Deviations: Determine the impacts of any deviations from sampling or analytical methods and SOPs. Consider the effectiveness and appropriateness of any corrective action.

Sampling Plan: Determine whether the sampling plan was executed as specified (i.e., the number, location, and type of field samples were collected and analyzed as specified in the QAPP).

Sampling Procedures: Evaluate whether sampling procedures were followed with respect to equipment and proper sampling support (e.g., techniques, equipment, decontamination, volume, temperature, preservatives, etc.).

Co-located Field Duplicates: Compare results of co-located field duplicates with criteria established in the QAPP.

Project Quantitation Limits: Determine that quantitation limits were achieved, as outlined in the QAPP, and that the laboratory successfully analyzed a standard at the QL.

Confirmatory Analyses: Evaluate agreement of laboratory results.

Performance Criteria: Evaluate QC data against project-specific performance criteria in the QAPP (i.e., evaluate quality parameters beyond those outlined in the methods).

Data Qualifiers: Determine that the data qualifiers applied were those specified in the QAPP and that any deviations from specifications were justified.

Validation Report: Summarize deviations from methods, procedures, or contracts. Include qualified data and an explanation of all data qualifiers.

A table of Data Qualifier flags and definitions will be provided by EPA Region 4 SESD in the near future.

Data Usability and Project Verification

The Tribe’s Project Manager will validate the field data and discuss any problems identified during the project with the Field Team Leader. Any problems and associated corrective actions will be documented in the field activity report. The Tribe’s Project Manager will discuss any problems along with proposed corrective actions with the Tribe’s QC Manager.

The Laboratory Director/QA Manager will review and verify the laboratory data generated under their corrective action system for accuracy according to the laboratory’s QAM. Any problems identified during this process will be reported to the Tribe’s Project Manager in the analytical data report. Any additional information on QC criteria will be included in the Project/Site-Specific QAPP, if applicable.

The Laboratory Director/QA Manager will evaluate the sample/sample duplicate data and the equipment blank data to determine whether data precision is of acceptable quality. Pending these three data validation procedures, data will be determined to be of a specified quality and reported as such. For instance, data will typically be reported with no qualifiers if the data are determined to be fully usable. However, a discussion of data limitations will be added to the data summary tables and the data discussion within the reports if data validity is compromised in any way.

Valid data of known and documented quality are required for all media sampled. Once reliable and representative data are obtained, the data will be compared to the CTLs to determine whether no further action is required or whether active remediation is needed. The Tribe’s Project Manager will reconcile the data with the project-specific objectives.
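The comparison of validated results to CTLs can be illustrated with a minimal sketch; the analytes, concentrations, and limits below are placeholders for illustration only, not actual regulatory values:

```python
# Hypothetical cleanup target levels (CTLs); real values come from the
# applicable regulations and the Project/Site-Specific QAPP.
ctls = {"benzene": 1.0, "lead": 15.0}

# Validated sample results, in the same units as the CTLs.
results = {"benzene": 0.4, "lead": 22.0}

# Flag each analyte whose validated result exceeds its CTL.
exceedances = {analyte: conc
               for analyte, conc in results.items()
               if analyte in ctls and conc > ctls[analyte]}

if exceedances:
    print("CTL exceedances; further action may be needed:", exceedances)
else:
    print("No CTL exceedances; no further action indicated.")
```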

The process for reconciling the data includes the evaluation of the following questions:

1. Were the appropriate collection procedures followed for the samples collected?
2. Were the collected samples handled in accordance with the SOPs?
3. Were the samples collected from the predetermined or specified sampling locations?
4. Were the collected samples properly preserved?
5. If applicable, were field sampling problems documented in field logs?
6. Were the QAPP-specified analytical/extraction and/or digestion methods used, and were the chain-of-custody procedures followed?
7. Were any problems identified during laboratory preparation and analysis?
8. Was the laboratory able to meet the MDLs, PQLs, and QA/QC requirements specified in the Project/Site-Specific QAPP, and was this documented?
9. What were the results of data validation?
10. Do any of the data points require rejection?
11. If data are problematic, is re-sampling or reanalysis required?
12. If data are rejected, how does the result affect the ability to make site decisions?

Because data generated with significant deviations from the requirements of the QAPP will be rejected, and because of the nature of the work (biased sampling), all data will have the same expected uncertainties and there will be no limitations on data use. NOTE: All rejected data are considered unusable, meaning the data cannot be used quantitatively or qualitatively to make decisions. The following is a list of considerations for the data usability assessment:

[Figure or Table _] Data Usability Assessment

Item: Assessment Activity

Data Deliverables and QAPP: Ensure that all necessary information was provided, including but not limited to validation results.

Deviations: Determine the impact of deviations on the usability of data.

Sampling Locations, Deviations: Determine if alterations to sample locations continue to satisfy the project objectives.

Chain-of-Custody, Deviations: Establish that any problems with documentation of custody procedures do not prevent the data from being used for the intended purpose.

Holding Times, Deviations: Determine the acceptability of data where holding times were exceeded.

Damaged Samples, Deviations: Determine whether the data from damaged samples are usable. If the data cannot be used, determine whether resampling is necessary.

PT Sample Results, Deviations: Determine the implications of any unacceptable analytes (as identified by the PT sample results) on the usability of the analytical results. Describe any limitations on the data.

SOPs and Methods, Deviations: Evaluate the impact of deviations from SOPs and specified methods on data quality.

QC Samples: Evaluate the implications of unacceptable QC sample results on the data usability for the associated samples. For example, consider the effects of blank contamination.

Matrix: Evaluate matrix effects (interference or bias).

Meteorological Data and Site Conditions: Evaluate the possible effects of meteorological conditions (e.g., wind, rain, temperature) and site conditions on sample results. Review field reports to identify whether any unusual conditions were present and how the sampling plan was executed.

Comparability: Ensure that results from different data collection activities achieve an acceptable level of agreement.

Completeness: Evaluate the impact of missing information. Ensure that enough information was obtained for the data to be usable (completeness as defined in the PQOs documented in the QAPP).

Background: Determine if background levels have been adequately established (if appropriate).

Critical Samples: Establish that critical samples and critical target analytes/COCs, as defined in the QAPP, were collected and analyzed. Determine if the results meet the criteria specified in the QAPP.

Data Restrictions: Describe the exact process for handling data that do not meet the PQOs (i.e., when measurement performance criteria are not met). Depending on how those data will be used, specify the restrictions on the use of those data for environmental decision-making.

Usability Decision: Determine if the data can be used to make a specific decision, considering the implications of all deviations and corrective actions.

Usability Report: Discuss and compare overall precision, accuracy, representativeness, comparability, completeness, and sensitivity for each matrix, analytical group, and concentration level. Describe limitations on the use of the project data if criteria for data quality indicators are not met.

Re-sampling may be necessary if results are deemed unacceptable for various reasons, such as exceeding laboratory holding times, or to confirm previous sampling and/or excavation activities, etc. These variables will be further defined in the Project/Site-Specific QAPP when the project description is detailed based on the specific contaminants of concern. Upon receipt of the laboratory data, the data are reviewed to verify usability. The data are then formatted into tables and compared to regulatory limits to determine whether contamination is present at the subject property. Most laboratories provide their data formatted in tables directly from their LIMS; this lessens the required manipulation of the data and therefore provides a more accurate presentation of the data. Upon completion of formatting the Analytical Data Table, the data are reviewed for accuracy by the Tribe’s QA Manager. Site figures and maps, including analytical results and sample locations, are frequently prepared for submittal with final reports. These figures and maps are also reviewed for accuracy by the Tribe’s QA Manager.


References

1. U.S. Environmental Protection Agency. 2006. Guidance on Systematic Planning Using the Data Quality Objectives Process. EPA QA/G-4. EPA 240/B-06/001. February.

2. U.S. Environmental Protection Agency. 2002. Guidance for Quality Assurance Project Plans. EPA QA/G-5. EPA 240/R-02/009. December.

3. U.S. Environmental Protection Agency. 2006. EPA Requirements for Quality Assurance Project Plans. EPA QA/R-5. EPA 240/B-01/003. Reissued May.

4. U.S. Environmental Protection Agency. 2006. Data Quality Assessment: Statistical Methods for Practitioners. EPA QA/G-9S. EPA 240/B-06/003. February.

5. U.S. Environmental Protection Agency Region 4, Science and Ecosystem Support Division (SESD), Field Branches Quality System and Technical Procedures, http://www.epa.gov/region4/sesd/fbqstp/index.html.


List of Abbreviations

ASTM: American Society for Testing and Materials
BS: Blank Spike
BSD: Blank Spike Duplicate
BTEX: Benzene, Toluene, Ethylbenzene, and Total Xylenes
CD: Compact Disc
CM: Centimeters
COC: Contaminants of Concern
CTL: Cleanup Target Levels
DAO: Designated Approving Official
DEFT: Decision Error Feasibility Trials
DPT: Direct Push Technology
DQO: Data Quality Objective
e.g.: exempli gratia - for example
ESA: Environmental Site Assessment
ECD: Electron Capture Detector
GC: Gas Chromatography
GC-MS: Gas Chromatography - Mass Spectrometry
GCTLs: Groundwater Cleanup Target Levels
GIS: Geographic Information Systems
GPS: Global Positioning System
HAZWOPER: Hazardous Waste Operations and Emergency Response
HPLC: High Performance Liquid Chromatography
ICP: Inductively Coupled Plasma
ID: Identification
i.e.: id est - that is
IUPAC: International Union of Pure and Applied Chemistry
LQM: Laboratory Quality Manual
MDLs: Method Detection Limits
MIP: Membrane Interface Probe
mL: Milliliter
MTBE: Methyl tert-butyl ether
MW: Monitor Well
NA: Not Applicable
NELAC: National Environmental Laboratory Accreditation Conference
NTU: Nephelometric Turbidity Units (turbidity)
OSHA: Occupational Safety and Health Administration
OVA: Organic Vapor Analyzer
PAHs: Polynuclear Aromatic Hydrocarbons
PE: Performance Evaluation
P.E.: Professional Engineer
P.G.: Professional Geologist
ppm: Parts Per Million
PPPM: Pre and Post Phosphate Mining Radiological Measurement, June 2006 - June 2007


PQLs: Practical Quantitation Limits
QA: Quality Assurance
QAM: Quality Assurance Manual
QAPP: Quality Assurance Project Plan
QASOP: Quality Assurance and Standard Operating Procedures
QC: Quality Control
QM: Quality Manual
%R: Percent Recovery
RCRA: Resource Conservation and Recovery Act
RL: Reporting Limit
RPD: Relative Percent Difference
SCTLs: Soil Cleanup Target Levels
SESD: Science and Ecosystem Support Division
SPLP: Synthetic Precipitation Leaching Procedure
SS: Soil Sample
SVOC: Semi-volatile Organic Compounds
SOP: Standard Operating Procedure
SPT: Standard Penetration Test
TCLP: Toxicity Characteristic Leaching Procedure
TQM: Total Quality Management
USC: Unified Soil Classification
U.S. EPA: United States Environmental Protection Agency
UST: Underground Storage Tank
VOC: Volatile Organic Compounds


USEPA REGION 4 TRIBAL PROGRAM LEVEL QAPP

Note: The checklist explains in detail the actual information that goes into each of the elements to create the Program Level QAPP. To ensure all elements have been properly addressed, the QAPP developer/writer’s organization must fill out Appendix A and submit it with the Program Level QAPP to the EPA Region 4 Project Officer. Indicate the page number where each of the elements can be found. This process helps speed up EPA Region 4’s review and approval of the Program Level QAPP, and it helps the QAPP developer/writer ensure accuracy and that the necessary content is in the Program Level QAPP.

Program Level QAPP Title:

Project Location:

Organization:

Name:

Signature and Date:

Tribal Program Level QAPP Elements Page

A-1. Title and Approval Page

Title of QAPP (Including Project Title, EPA #, date, and Revision #)

Organization’s Name: Both the name of the organization preparing the QAPP and the organization conducting the project or the grantee’s name.

Dated Signature of Approving Officials (Names, Titles). Example: Project Manager (Both the originating organization’s PM and EPA’s corresponding PM and/or PO).

Date and Signature of Quality Assurance Manager’s approval for the originating entity and EPA Region 4’s Designated Approving Official (DAO).

Other Key Signatures as needed and/or requested on the project. Example: Consultant hired by Tribe.

A-2. Table of Contents: Including Tables, Figures and Appendices

Due to the size and complexity of the Program Level QAPP, please provide a table of contents outlining all appropriate sections and appendices.


A-3. Distribution List: Name, title/position, organization, and contact information (telephone, mailing and email address) of all entities or agencies requiring copies of the QAPP. Should include all major individuals mentioned in the document – important for “controlled” document requirements.

TIP: If portions are unknown, indicate that the information will be provided in the project/site-specific QAPP (e.g., Field Team Leader).

A-4. Project/Task Organization

Identifies key project personnel, specifies technical disciplines, details their roles/responsibilities and details the chain of command.

Organization chart provided: Depicts lines of authority, and reporting responsibilities. Organization chart also contains entries for all agencies, contractors and individuals responsible for performing QAPP preparation, sample collection, laboratory analysis, data verification, review and validation, data quality assessment; and project oversight responsibilities. An organization chart is preferred, but a table format is acceptable.

A-5. Problem Definition/Background

Please indicate in the Program Level QAPP that a project’s Problem Definition will be provided in a Project/site-specific QAPP Addendum. (See Appendix B Tribal Project/site-specific QAPP Addendum for information on concepts to be covered in this section).

A-6. Project/Task Description/Timeline

Please indicate in the Program Level QAPP that a project’s Project Description/timeline will be provided in a Project/site-specific QAPP Addendum. (See Appendix B Tribal Project/site-specific QAPP Addendum for information on concepts to be covered in this section.)

A-7. Special Training Requirements and Special Certifications

Identifies how training needs are determined and lists all training requirements for the project. Specifies whether certain professionals require a license or certification to perform duties as required by federal or state laws.

Identifies where training records will be maintained


Identifies how any new training requirements are communicated to program/upper management.

Discusses the importance of QA training and discusses how this training is provided.

A-8. Documentation and Records

Provides a comprehensive list of the documents and records required for this project (including raw data, field logs, audit reports, QA reports, progress or status reports, analytical data reports, data validation reports/data quality assessments reports).

Specifies the turnaround time for laboratory data deliverables (both hardcopy and electronic formats). Provides hardcopy data package content requirements and electronic data requirements.

Provides the retention time and location of study records, reports and formal documents.

B-1. Sampling Process Design & Site Figures

Please indicate in the Program Level QAPP that a project’s Sampling Process Design and Site Figures will be provided in a project/site-specific QAPP Addendum.

TIP: See Appendix B Tribal Project/site-specific QAPP Addendum for information on concepts to be covered in this section.

B-2. Sampling & Analytical Method Requirements

TIP: In the Program Level QAPP, please provide an example of the Sampling and Analytical Methods Requirements table that will be used in all Project/site-specific QAPP Addenda. The table should be completed with all pre-established analytical information (i.e., matrix, parameter, container, preservation, and holding time information). This table may be used in the future as a template that can be edited for the individual Project/site-specific QAPP Addendum.

Provides the required field sample collection procedures, protocols and methods.

Provides a list of sampling/collection equipment (including make and model of equipment).

Identifies on-site support facilities that are available to field staff.


Identifies key study personnel in charge of or overseeing sampling/collection activities.

Describes equipment decontamination procedures and requirements. Discusses whether sampling equipment is dedicated or non-dedicated.

Provides table listing sample container requirements and preparation requirements for these containers (if provided by laboratory, clearly states such).

Provides table listing sample preservation requirements (for chemical parameters) and holding time criteria (where applicable).

B-3. Sample Handling & Custody Requirements

Provides a detailed description of the procedures for post sample handling (once the sample has been collected).

Provides a detailed description of the chain-of-custody (COC) procedures that will be followed in preparing the field samples for transport to the laboratory.

TIP: If an FSAP and/or SOP is available, simply reference and include the FSAP and/or SOP in an appendix.

Provide a copy of a COC form, sample label, and custody seal.

B-4. Analytical Methods & Requirements

Clearly identifies the extraction, digestion, and analytical methodologies to be followed (provides the actual method numbers and includes all relevant options or modifications required), and identifies the required instrumentation. Provides laboratory SOPs or the QAM.

Provides validation criteria for non-standard or unpublished methodologies proposed for use for a given study.

Identifies individual(s) responsible for overseeing the success of the analysis and for implementing corrective actions if deemed necessary.

Specifies the turnaround time for hardcopy and electronic laboratory data deliverables.


B-5. Field Quality Control Requirements

Design the field QC program that will be routinely performed on Tribal projects, and provide a corresponding field sampling QC table in the QAPP. Break the QC program down by parameter and matrix to identify the appropriate criteria that will be used for the evaluation. The information presented in this table is what will be used in the data evaluation process. Include:

• Each type of field QC sample included in the project;
• The frequency at which it will be included;
• The acceptance criteria (control limits) against which the data will be compared; and
• The actions the data evaluator performs when control limits are exceeded.

Typical projects will include field duplicate samples for each matrix and parameter, trip blanks for VOC samples, and temperature blanks for the shipping coolers. However, several other types of field QC samples should be considered for inclusion in the project. These may include equipment blanks, performance evaluation samples (i.e., a certified standard submitted to the laboratory as a blind QC sample), and matrix spike/matrix spike duplicate (MS/MSD) samples. The environmental professional should weigh several factors in making these decisions, including the project objectives, issues with particular primary contaminants of concern (e.g., the project criterion is near the limits of sensitivity of the method), and issues associated with certain difficult matrices (e.g., highly organic soil/sediment, brackish water, etc.). When additional field QC is deemed appropriate, the purpose behind including the additional QC samples should be described in the Project/site-specific QAPP addendum, along with the relevant information described above.
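The field QC table described above can be captured as a simple data structure; every frequency and control limit shown below is an illustrative placeholder to be replaced with the Tribe’s actual project criteria:

```python
# Illustrative field QC plan; each frequency and acceptance criterion
# below is an assumed placeholder, not a requirement of this QAPP.
field_qc_plan = [
    {
        "qc_sample": "field duplicate",
        "frequency": "1 per 10 samples per matrix/parameter (assumed)",
        "acceptance_criteria": "RPD within project control limits",
        "corrective_action": "qualify associated results; review technique",
    },
    {
        "qc_sample": "trip blank (VOCs)",
        "frequency": "1 per cooler containing VOC samples",
        "acceptance_criteria": "no target analytes above the reporting limit",
        "corrective_action": "qualify associated VOC results; find source",
    },
    {
        "qc_sample": "temperature blank",
        "frequency": "1 per shipping cooler",
        "acceptance_criteria": "receipt temperature within preservation range",
        "corrective_action": "document deviation; assess impact on results",
    },
]

# Print a one-line summary per QC sample type.
for row in field_qc_plan:
    print(f'{row["qc_sample"]}: {row["frequency"]}')
```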

TIP: For field duplicate soil samples, document whether they are being collected as collocated duplicates (collected adjacent to each other), or as a split of a single homogenized sample. Collocated duplicate data is useful for evaluating the homogeneity of the soil/sediment matrix within a relative area.

TIP: Individual Tribes may require matrix spike samples as part of the QA/QC requirements that the laboratory needs to follow and report on. These requirements do take precedence and should be reflected in this table.

TIP: Matrix spike samples are being considered as part of the field QC program because they need to be specified on the chain-of-custody and often require the field sampler to collect additional sample volume for the laboratory.


B-6. Laboratory Quality Control Requirements

Determine the laboratory QC data to be routinely included with the laboratory’s data package, and provide a corresponding laboratory analytical QC table in the QAPP. Break down by parameter and matrix, as appropriate, based on the information provided by the laboratory. The information presented in this table is what will be used in the data evaluation process described in Section D2. Include:

For each type of laboratory QC sample:
- Frequency;
- Laboratory acceptance criteria (control limits);
- The actions the data evaluator performs when control limits are exceeded.

TIP: Typical Tribal projects will include the following laboratory QC results:
- Organic analyses: method blanks, surrogate data, and lab control samples/lab control sample duplicates (LCS/LCSD);
- Inorganic analyses: method blanks and lab control samples (LCS).

Note, method blanks and LCSs are QC samples that are brought through the identical extraction and analysis procedures as the field samples.

B-7. Field Equipment & Corrective Action

TIP: Below is an outline of the field equipment calibration QA/QC information that needs to be provided in a QAPP. If this information is clearly contained in the FSAP and/or SOPs attached to the QAPP, simply reference that appendix in this section of the QAPP. If this information is not clearly contained in the FSAPs and/or SOPs, please provide a field equipment calibration table for the various types of field equipment routinely used on Tribal projects (e.g., PID, individual low flow water quality parameters, etc.).

- Document the initial calibration (including standards and concentrations used), and any continuing calibration checks used throughout operation to check for drift (standards, blanks, etc.).
- Indicate the frequency at which each is performed (when and how often).
- Indicate the acceptance criteria (control limits) that need to be met to proceed.
- Discuss the corrective actions taken in the field when the control limits are not met.

B-8. Lab Equipment & Corrective Action


TIP: Below is an outline of the laboratory equipment calibration QA/QC information that needs to be provided in a QAPP. If this information is clearly contained in the laboratory SOPs attached to the QAPP, simply reference that appendix in this section of the QAPP. If this information is not clearly contained in the SOPs, please provide a laboratory equipment calibration table for each analytical method routinely used on Tribal projects. If this information is unknown, specify that it will be provided in the Site-Specific Addendum.
- Initial calibration (include the number of initial calibration standards and the calibration range);
- Independent calibration check standard (include relevant concentrations);
- Continuing calibration checks (calibration blanks and the concentration of the continuing calibration check standard).

TIP: For each calibration step, include:
- The frequency at which each is performed;
- The acceptance criteria (control limits);
- Laboratory corrective actions to be taken when control limits are not met.
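The continuing calibration drift check described above can be sketched in a few lines. This is a minimal illustration only, not a prescribed procedure: the instrument reading, the 100 ppm standard, and the ±10% acceptance limit are all assumptions for the example.

```python
# Illustrative continuing calibration check against control limits.
# The reading, the 100 ppm standard, and the +/-10% limit are
# hypothetical values for this sketch only.

def check_calibration(reading, standard, limit_pct=10.0):
    """Return (percent drift, within-limits flag) for one calibration check."""
    drift = (reading - standard) * 100.0 / standard
    return drift, abs(drift) <= limit_pct

# e.g., an instrument reads 108 against a nominal 100 ppm check standard
drift, ok = check_calibration(reading=108.0, standard=100.0)
print(f"Drift: {drift:+.1f}% -> {'proceed' if ok else 'corrective action required'}")
```

A real table would pair each instrument and check standard with its own documented limits and corrective actions, as called for above.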

B-9. Analytical Sensitivity & Project Criteria

Provide an analytical method sensitivity and project criteria table for the analytical methods that will be routinely performed on Tribal projects.

TIP: If data from multiple laboratories is presented, the site-specific project plan will need to clarify which laboratory is being used on the project. (As new methods and/or new laboratories are added on, this table is to be updated accordingly.) If this information is unknown, specify it will be in the Site-Specific Addendum.

This is a very important table for both planning the project and evaluating the resulting data. Initially, the table helps evaluate potential concerns with the sensitivity of an analytical method in relation to the project criteria, particularly for primary contaminants of concern. Later, the table is critical in understanding the usability of a data point when a sample result is near the project criteria, which are in turn near the quantitation limits and/or detection limits of the method (i.e., is the data point usable, or is more data needed to support a decision or trend in site contamination?). The information presented in this table can be used as a handy reference in the data evaluation process. The table is to include:

- Laboratory providing the data;
- Analytical method reference (e.g., VOCs by 8260B);
- Matrix (soil, groundwater, air, etc.);
- Analyte/compound list;
- Method detection limit (MDL);
- Quantitation/reporting limit (QL/RL);
- Relevant state/federal criteria or standards associated with each analyte/compound and each matrix.

Note: if the laboratory provides only one analytical method limit, note in the table whether it is the MDL or the QL/RL that is being reported. When project criteria are near the MDL, special care should be taken in reviewing this data, particularly if it is a primary contaminant of concern. Depending on the situation, the environmental professional may choose to seek an alternate method with a lower limit of detection.
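As a planning aid, the sensitivity comparison described above can be automated. The sketch below is illustrative only: the analyte names, MDLs, QL/RLs, and project criteria are hypothetical placeholders, not real method limits.

```python
# Flag analytes whose project criterion falls below the MDL or at/below
# the QL/RL. All names and numbers here are hypothetical examples.

def sensitivity_flag(mdl, ql, criterion):
    """Classify the sensitivity of a method relative to a project criterion."""
    if criterion < mdl:
        return "criterion below MDL -- consider a more sensitive method"
    elif criterion <= ql:
        return "criterion at/below QL/RL -- review detections with care"
    return "adequate sensitivity"

analytes = [
    # (analyte, MDL, QL/RL, project criterion) -- all in ug/L (illustrative)
    ("Benzene",        0.2, 0.5, 5.0),
    ("Vinyl chloride", 0.5, 1.0, 0.2),   # criterion below the MDL
    ("TCE",            0.3, 1.0, 1.0),   # criterion equal to the QL/RL
]
for name, mdl, ql, criterion in analytes:
    print(f"{name}: {sensitivity_flag(mdl, ql, criterion)}")
```

A check like this is only as good as the limits entered in the table, which is why matching units and current laboratory limits matter, as the following note emphasizes.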

Note: Please make sure the appropriate units are specified and that the analytical method and project criteria units are the same.

B-10. Data Management & Documents

TIP: Describe the documentation that will be generated for the project, and the data management procedures that will be used in handling that information. The three basic areas to cover include the field data, laboratory data, and manipulated data presented in the final report. Clearly specify what documentation goes into the project file and what documentation will be provided in the final report.

Field Documents and Records

Discuss the field documents and records that will be routinely generated, collected, and managed in a Tribal project (e.g., field notes, field screening and analytical data, boring logs, low flow parameters, photographs, etc.). For each:

Describe the process for the collection and organization of the field documents and records, and the relevant data reduction steps that are routinely performed, including new documents generated based on manipulating the data. (If an SOP for taking daily field notes is not provided in the appendix to the QAPP, please discuss those procedures in this section.)

Describe any QA checks (i.e., for completeness, consistency, accuracy, etc.) that are performed on originally collected data and manipulated data.

Specify what documents and records will be stored in the project file and which will be provided in the final report.

Provide copies of all field forms that will be routinely used in an appendix to the QAPP.


Laboratory Documents and Records

Specify the contents of the routine laboratory data package deliverable that the laboratory is responsible for providing. For a typical Tribal project the following minimum deliverable is recommended:

- Project narrative containing an explanation of any qualified data, and any observations or deviations encountered during analysis;
- Data results sheets (including preparation and analysis dates, percent solids for soil/sediment samples, sample concentrations, units, reporting limits, etc.);
- Laboratory QC package.

When the ability to perform an in-depth evaluation of the data may be desired, or data defensibility issues are anticipated in the future, the environmental professional should consider requesting complete data packages from the laboratory at the time the work is performed. Although an in-depth evaluation of this data may not be needed, it is important to obtain this data at the time of the project so appropriate completeness checks can be performed while it is current. When full data packages are deemed appropriate, the purpose should be described in the work plan, along with an updated contents list of what is being required in the deliverable. Complete packages generally include all of the above requirements, plus:

- All initial and continuing calibration results and acceptance limits;
- All other method QC data;
- All sample preparation and analysis raw data (including printouts, chromatograms, and laboratory notebook pages).

Post Laboratory Data Manipulation

Describe the routine data entry/manipulation process that takes place in processing the laboratory data for further evaluation and reporting (i.e., transfer into databases for manipulation into data tables, graphics, forms, etc.). Include relevant QC data that is manipulated for presentation. In addition:

- Describe any checks that will be performed to detect and correct errors, and to prevent loss of data during data reduction, data reporting, and data entry into forms/reports/databases;
- Specify what hardcopy and electronic documents and laboratory records will be stored in the project file, and what hardcopy and electronic documents and laboratory records will be provided in the final report;
- Identify applicable software routinely used in data manipulation.
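One common error-detection check of the kind called for above is a round-trip comparison between values keyed into report tables and the laboratory's electronic data deliverable (EDD). The sketch below is a hypothetical illustration; the sample IDs and results are invented for the example.

```python
# Compare manually entered report values against the lab EDD to catch
# transcription errors. All IDs and results are illustrative only.

def find_transcription_errors(edd, entered):
    """Return (sample, analyte) keys whose entered value differs from the EDD."""
    return [key for key, value in edd.items() if entered.get(key) != value]

edd     = {("MW-1", "Benzene"): 7.2, ("MW-2", "Benzene"): 1.1}
entered = {("MW-1", "Benzene"): 7.2, ("MW-2", "Benzene"): 11.0}  # keying slip

for key in find_transcription_errors(edd, entered):
    print(f"Mismatch for {key}: EDD={edd[key]}, entered={entered.get(key)}")
```

In practice such checks are usually built into the database import step, with discrepancies resolved against the hardcopy data package before reporting.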


Project File

Specify where and how long the project file will be maintained and stored, and its final disposition after that period.

C-1. Assessments & Response Actions

Develop and describe the assessment/oversight plan that will be routinely followed with each project, including:

- Types of assessments and oversight that will be performed;
- Frequency (when during the project);
- Identify the person responsible for performing the assessments/oversight (e.g., field leader, QA officer, etc.), and describe where the results will be documented;
- Identify who will receive the assessment/oversight report;
- Identify who will be responsible for dealing with corrective actions and follow-up on assessments/oversight.

TIP: Since Tribal projects can be relatively short-term projects, a typical assessment plan would include 1) oversight of the field team and field subcontractors (early in the project) by an experienced field leader knowledgeable in the project objectives, and 2) peer review of the final report. Oversight, in this case, essentially means checking whether the project is going according to the plan and procedures in place, helping with problems and questions, and providing a set of eyes that keeps the total project in perspective.

Many other areas of a project (including the laboratory) can benefit from assessment and oversight, and prime contractors are encouraged to develop and implement a long-term approach to this type of quality assurance program. Please indicate in this section of the QAPP when additional assessment/oversight is planned for a project. The scope and purpose behind the assessment should be described in the work plan (and include the information listed above).


C-2. Project Reports

Identify the types of reports that will be routinely provided during the Tribal project (e.g., status reports, final reports, etc.). Include:

- Type of report;
- Frequency of reporting;
- The position(s) of the person(s) who will be responsible for preparing the reports;
- The organizations who will be receiving the reports.

For the final project report, a fairly detailed description of its contents should be provided to establish appropriate expectations between the report preparer and the client. Please describe the primary components of the main body of the document, and specify any routine tables and graphics being provided. Also list the various appendices routinely included in the report (and identify items that will be routinely provided in electronic format).

For the format of final project reports, it is preferred that the main body of the report, the summary tables and the graphics all be provided in hardcopy. Appendices that would require large volumes of paper to reproduce (such as the laboratory data package) are preferred in electronic format on a CD. Proper indexing of the CD, for easy review of the information, is recommended and greatly appreciated.

Note, summary data tables of the field sample results should always include the relevant project criteria/standards for easy comparison, and results exceeding criteria should be highlighted in some manner.

Note, in the final report, it is preferred that the summary discussion of the tasks performed on the project and the results of those tasks not be separated into two sections. The review flows much more smoothly if the summary of each task is followed immediately by its results (i.e., here is how the soil boring program was laid out, and here are the results). The combined discussion, along with the tables and maps, helps the reviewer better visualize the layout of contamination on site.


D-1. Field Data Evaluation

Describe the final data evaluation process that will be routinely performed on the field data (field notes, boring logs, field screening results, and field analytical data, etc.). This evaluation is intended to gather and document important information from the field data that may impact the project, or assist in the interpretation of the laboratory data and the conceptual site model.

TIP: It is important that any observations, trends, conclusions, and limitations discovered in reviewing the field data be interpreted and documented in the final report. For each component of the field data evaluation, indicate how the results of the evaluation will be documented, and what will be presented in the final report.

Indicate the position(s) of the person(s) who will be performing the field data evaluation.

D-2. Laboratory Data Evaluation

Describe the final data evaluation process that will be routinely performed on the laboratory data.
- Perform a completeness check of the laboratory data package to ensure it is compliant with the requirements in the QAPP. Missing information or questions concerning the data package are to be addressed with the laboratory, and any pertinent information should be documented and/or provided in the final report.
- Review the chain-of-custody, sample preservation, and holding time results. Document the presence or absence of any problems with the data, and note any relevant sample data that may be impacted.
- Evaluate the field QC sample results, including data qualifiers for sample results. For the field duplicate sample results, tabulate the relative percent differences (include these results in the final report). If other field QC samples were submitted, such as performance evaluation samples or matrix spike samples, this data should also be tabulated with appropriate recoveries and reported accordingly. Document the presence or absence of any problems or issues and note any relevant sample data that may be impacted, as appropriate.
- Evaluate the laboratory QC results. Document the presence or absence of any problems or issues and note any relevant sample data that may be impacted.
- For each of the components of the laboratory data evaluation, indicate how the results of the evaluation will be documented, and what will be presented in the final report. Again, it is important that any observations, trends, and limitations discovered in the field and/or laboratory QC data be interpreted and documented in the final report.
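The relative percent difference (RPD) tabulation for field duplicates mentioned above is a simple calculation: the absolute difference between the pair, divided by the pair's mean, expressed as a percentage. The sketch below is illustrative only; the sample IDs, results, and the 30% control limit are assumptions for the example.

```python
# Tabulate field duplicate relative percent differences (RPDs).
# Sample IDs, results, and the 30% limit are hypothetical examples.

def rpd(primary, duplicate):
    """Relative percent difference between a sample and its field duplicate."""
    mean = (primary + duplicate) / 2.0
    return abs(primary - duplicate) * 100.0 / mean if mean else 0.0

pairs = [("MW-1 / MW-1-DUP", 12.0, 14.0), ("SB-3 / SB-3-DUP", 250.0, 410.0)]
for pair_id, primary, duplicate in pairs:
    value = rpd(primary, duplicate)
    note = "exceeds 30% limit" if value > 30.0 else "within limits"
    print(f"{pair_id}: RPD = {value:.1f}% ({note})")
```

The control limit applied should come from the field QC table in Section B-5, and any exceedances should be noted against the affected sample data.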

Indicate the position(s) of the person(s) who will be performing the laboratory data evaluation.


D-3. Data Usability & Project Evaluation

Describe the overall project evaluation process that will be routinely performed to determine the nuances in the usability of the data, update the conceptual site model for the property, and to determine if the objectives of the project have been met.

- Tabulate the field sample data together with the state/federal standards for presentation in the final report. Highlight any sample results exceeding criteria. Check the table for correctness and appropriate units.
- Prepare site figures/maps and other graphical representations, as appropriate, and check for correctness and accuracy.
- Using the summary tables and graphical presentations, evaluate the usability of the individual field sample results at the parameter level. Document any limitations on how the data should be used and/or interpreted.

In evaluating usability, consider:
- The sensitivity criteria. (As sample concentrations approach the reporting limit, and on down to the MDL, the precision and accuracy of the data can be expected to worsen, which can impact how you judge the usability of this data.)
- The results of the field data evaluation.
- The results of the laboratory data evaluation.

Some items to look for may include:

Pay attention to contaminants of concern where the concentration is near the project criteria and reporting limits for the method. Are there sufficient surrounding data points to support a trend of real contamination, or is more data needed to support a conclusion or decision?

Look at the field duplicate results in evaluating the heterogeneity of a particular matrix. This variability can impact the usability of low-level results near the project criteria. Are more data points needed to support a conclusion or decision (i.e., was it a solo hit just above the criteria)?

Look at sample results that are reported at elevated reporting limits due to dilution of the sample during analysis. Is the usability of the data compromised because the reporting limits are greater than the project criteria? Does the laboratory need to be contacted to determine the reason for the dilution? Can cleanup and reanalysis be performed to salvage the data?

Look at the low flow groundwater quality data. Does the turbidity data impact the use of the SVOC, PCB or metals data where the concentration is near the project criteria and reporting limits for the method? Etc.
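The tabulate-and-highlight step at the start of this section can be sketched in code. This is a hedged illustration, not a prescribed procedure: the sample IDs, results, criteria, and units below are hypothetical, and the unit-consistency check mirrors the "appropriate units" reminder above.

```python
# Compare each field sample result to its project criterion and flag
# exceedances for highlighting. Names, results, and criteria are
# illustrative assumptions only.

def flag_exceedance(result, units, criterion, criterion_units):
    """Return True if the result exceeds the criterion (units must match)."""
    if units != criterion_units:
        raise ValueError(f"unit mismatch: {units} vs {criterion_units}")
    return result > criterion

rows = [
    # (sample, analyte, result, units, criterion, criterion units)
    ("MW-1", "Benzene", 7.2, "ug/L", 5.0, "ug/L"),
    ("MW-2", "Benzene", 1.1, "ug/L", 5.0, "ug/L"),
]
for sample, analyte, result, units, criterion, cu in rows:
    status = "EXCEEDS" if flag_exceedance(result, units, criterion, cu) else "ok"
    print(f"{sample} {analyte}: {result} {units} vs {criterion} {cu} -> {status}")
```

Raising an error on a unit mismatch, rather than silently comparing, is deliberate: a µg/L result compared against a mg/L criterion would otherwise produce a false pass or false exceedance.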


Based on the results of the data usability study conducted above, use the summary tables and site maps to perform the overall project evaluation. Document any observations, trends, anomalies, or data gaps that may exist. Evaluate how the sample results have impacted the conceptual site model for the property, and whether the objectives of the project have been met. Draw conclusions and recommendations from all the information obtained above, and document them appropriately in the final report.

For each of the components of the data usability and project evaluation, indicate how the results of the evaluation will be documented, and what will be presented in the final report. Indicate the position(s) of the person(s) who will be performing the data usability and project evaluation.
