DoD Program Overview
Presented to: DOE ASP Workshop 2014
Albuquerque, NM

Fred McLean
Navy Laboratory Quality & Accreditation Office
Chair, DoD Quality Assurance Oversight Subgroup

Alyssa Wingard
BMT Designers and Planners
DoD Support
Agenda
Latest EDQW Initiatives
DoD ELAP Status Update
Latest QSM Questions
For Future Consideration
EDQW Overview Responsibilities
Recommend sampling, testing, and quality systems policy
Develop quality systems guidance documents
Provide support on analytical issues to the DoD Chemical and Material Risk Management Directorate
Represent DoD on public and governmental workgroups:
  Intergovernmental Data Quality Task Force (IDQTF)
  The NELAC Institute (TNI) Board
Ensure environmental data are of the type and quality necessary to support their intended uses
EDQW Initiatives
Procurement Policy Updates
Use of QSM Variances
NELAC/TNI Status Perspective
EPA Method 8330B
  FAQ/Training session scheduled December 2014
  Collaboration with EPA on proposed Method 8330C
Indoor Air – HAPSITE Units
PFOA/PFOS
Advanced Geophysical Classification
EDQW Outreach
2015 Environmental Monitoring and Data Quality (EMDQ) Workshop
Location: Portland, Oregon
Proposed Dates: April 28 – 30, 2015
Call for papers early in the fiscal year, after anticipated approval
DoD-ELAP
DoD ELAP Status Update
Current Status of Laboratories and QSM Version 5.0 implementation
DoD ELAP Distribution
Accreditation Bodies – Trends
DoD ELAP Scopes
Oversight and Implementation Issues
DENIX Database
Frequent Findings
Resources
DoD ELAP Accreditation Bodies
All are International Laboratory Accreditation Cooperation (ILAC) Signatories:
American Association for Laboratory Accreditation (A2LA)
ANSI-ASQ National Accreditation Board (ACLASS)
Laboratory Accreditation Bureau (L-A-B)
Perry Johnson Laboratory Accreditation, Inc. (PJLA)
Current Status of Labs and QSM Version 5 Implementation
101 labs currently accredited; 44 laboratories prior to DoD ELAP.
All 4 ABs are performing assessments to QSM Version 5.0:
L-A-B and A2LA are assessing only to Version 5.0.
PJLA and ACLASS are allowing laboratories flexibility; labs can be assessed to Version 4.2 or 5.0.
All labs must be accredited to QSM Version 5.0 by 1/1/2016.
DoD ELAP Distribution – Accredited Laboratories
State/Province  Labs    State/Province  Labs    State/Province  Labs
Alaska 1 Massachusetts 4 Oregon 1
Arkansas 1 Michigan 6 Pennsylvania 3
California 15 Minnesota 1 Rhode Island 2
Colorado 3 Missouri 2 South Carolina 2
Delaware 1 Montana 1 Tennessee 4
Florida 7 Nevada 1 Texas 6
Georgia 2 New Hampshire 2 Utah 2
Illinois 2 New Jersey 2 Vermont 2
Indiana 1 New York 1 Washington 7
Kansas 1 North Carolina 4 Wisconsin 3
Louisiana 2 Ohio 3 Italy 2
Maine 1 Oklahoma 1 Canada 1
Maryland 1
DoD ELAP – AB Distribution
Continuing to see a shift in the market; transfers amongst ABs.

AB      # of Labs (current)   # of Labs (end of 2013)   # of Labs (end of 2012)
A2LA    27                    29                        29
ACLASS  12                    11                        14
LAB     28                    28                        31
PJLA    35                    31                        28
DoD ELAP Scope
Applies to all laboratories regardless of:
Size
Volume of business
Field(s) of accreditation
Public/private
Mobile/fixed
CONUS/OCONUS
DoD ELAP Scopes
Changes from 2009 to 2014:
Scopes are being streamlined
“Specialty Analyses”:
  PFOA/PFOS
  Chemical warfare agents
  8330B
  TO-15
Elimination of ASTM Methods
Elimination of Standard Methods
Labs working with projects to have accreditation to meet project needs
DoD ELAP Oversight
Witness new assessors
Review assessment reports
Witness each AB on-site
Participate in ILAC Peer Reviews
Database with accreditation expiration dates
Bi-monthly AB calls
Annual individual AB calls
DENIX Database
Real time – all labs accredited
Database developed by DENIX
Information supplied by the ABs
List includes method but not analyte
List updated by the ABs
Lab status must be verified on the AB website
Common Findings
Laboratory practice or SOP does not match the published method, or the lab is not following its own SOP
Deviations from test methods are not documented and technically justified
Laboratories are not determining LOD/LOQ quarterly
Internal audits
Management review
Not running QC samples at the required frequency
Records not maintained for equipment and supplies
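The quarterly LOD/LOQ determination cited above is, at its core, a replicate-based statistical calculation. A minimal sketch, assuming the classic 40 CFR Part 136 Appendix B style MDL (seven low-level spike replicates, one-sided 99% Student's t); the QSM's own verification details are not reproduced here:

```python
# Illustrative replicate-based detection limit calculation: the sample
# standard deviation of n low-level spike replicates times the one-sided
# 99% Student's t value. The t value is hard-coded for n = 7 replicates
# (6 degrees of freedom); the function name is a placeholder, not QSM usage.
from statistics import stdev

T_99_6DF = 3.143  # one-sided 99% Student's t, 6 degrees of freedom

def mdl_from_replicates(replicates, t_value=T_99_6DF):
    """Detection limit estimate from low-level spike replicates."""
    return t_value * stdev(replicates)

# Seven replicate results (ug/L) from a low-level spike
reps = [0.51, 0.46, 0.55, 0.49, 0.52, 0.47, 0.50]
print(round(mdl_from_replicates(reps), 3))
```

The LOQ is then typically verified at or above this value; a lab that skips the quarterly study has no current basis for either limit.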
DoD and DOE Labs
ALS – Salt Lake, Fort Collins
American Radiation Services (ARS)
BC Laboratories
Brooks Rand
Eberline Analytical – Oak Ridge
GEL Laboratories
Shealy Environmental
TestAmerica – Knoxville, St. Louis, Denver, Richland
12 laboratories total
DoD ELAP Resources
DoD ELAP Fact Sheet
QSM V5.0 Detection and Quantitation Fact Sheet
DoD ELAP & QSM FAQs
DoD EMDQ Workshop
Policy Memos
Webinars – 8330B
Published on websites:
https://www.denix.osd.mil/portal/page/portal/EDQW
www.navylabs.navy.mil
DoD/DOE Consolidated QSM
QSM Update
QSM Overview
New Frequently Asked Questions (FAQs)
For Future Consideration
DoD / DOE Participants
Fred McLean, Chair – Navy
Dr. Chung-Rei Mao – Army Corps
Dr. Seb Gillette – Air Force
Joe Pardue – DOE
Dr. Todd Hardt – DOE
Alyssa Wingard – Navy
Alison Felix – Navy
Janice Willey – Navy
Ed Corl – Navy
Dr. Tom Georgian – Army Corps
Dr. Jan Dunker – Army Corps
Dr. Charles Stoner – Army Health Center
Dr. Aurelie Soreefan – USAFSAM
Christina Rousch – USAFSAM
John Schwarz – Army CWA
QSM
A “stand-alone” document has been created that requires ISO/IEC 17025:2005 and the 2009 TNI Standard for full compliance.
The standard is called the DoD/DOE Consolidated Quality Systems Manual for Environmental Laboratories.
The standard has multiple cover pages: a consolidated DoD/DOE cover page, a DoD cover page for QSM Version 5.0, and a DOE cover page for QSAS Version 3.0.
DoD and DOE signed out the standard in the fourth quarter of 2013.
For DoD, accreditation to QSM Version 5.0 started in January 2014. All laboratories must be accredited to QSM Version 5.0 by 1/1/2016.
Frequently Asked Questions (FAQs)
GENERAL:
The age-old problem still exists: projects do not always clearly define their data quality objectives, and often simply defer to the QSM/QSAS. Projects will request drinking water (524, 525) and wastewater (624, 625) methods but not actually use them for regulatory purposes.
The QSM/QSAS method tables do not apply to drinking water/wastewater methods. Regulations (and regulatory methods) and project requirements defined in a QAPP supersede the QSM/QSAS.
Frequently Asked Questions (FAQs)
GENERAL:
If an analyte is not available as a PT sample, can the laboratory perform an LCS study (4 replicates) to demonstrate proficiency?
Yes. Module 1, section 2.1.4 states “when PT samples for an analyte-matrix-method cannot be obtained from a PT provider… and is required… other measures as outlined in the appropriate TNI Test Modules must be performed to satisfy the PT requirement.”
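The four-replicate LCS study mentioned above boils down to checking each replicate's percent recovery against the applicable acceptance window. A minimal sketch, assuming a hypothetical 70-130% window; the real limits come from the QSM/QSAS tables or the appropriate TNI Test Module for the analyte-matrix-method combination:

```python
# Illustrative 4-replicate LCS proficiency evaluation. The acceptance
# window (70-130% recovery) and the function name are placeholders,
# not values taken from the QSM/QSAS.
def evaluate_lcs_study(measured, true_value, low=70.0, high=130.0):
    """Evaluate replicate LCS recoveries against an acceptance window.

    measured   -- list of measured concentrations, one per replicate
    true_value -- known spike concentration
    Returns (recoveries, passed); passed is True only if every
    individual recovery falls inside the window.
    """
    recoveries = [100.0 * m / true_value for m in measured]
    passed = all(low <= r <= high for r in recoveries)
    return recoveries, passed

# Example: four replicates spiked at 50 ug/L
recs, ok = evaluate_lcs_study([48.0, 51.5, 49.2, 50.8], 50.0)
print([round(r, 1) for r in recs], ok)
```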
Frequently Asked Questions (FAQs)
GENERAL:
Can I get a variance to QSM requirements or a contract change? I do not like how you guys handled calibration, what were you thinking?
The EDQW does not issue variances. The requirements for accreditation are considered baseline minimum quality requirements.
Frequently Asked Questions (FAQs)
CHEMISTRY:
In Module 4 (Chemistry Testing) sections 1.5.2.1 and 1.5.2.2 why would LODs and LOQs be required for surrogates?
The LOD and LOQ are evaluated for surrogates in order to determine the point at which a surrogate is “diluted out” and therefore not evaluated for percent recovery. While not necessarily important from a laboratory’s point of view, this evaluation can be very important during data evaluation and in evaluating data usability.
Frequently Asked Questions (FAQs)
CHEMISTRY:
Well, what about surrogate LOD/LOQ for methods that are “post spike” additions and would never be diluted out?
The allowance for reporting surrogates without rigorous DL/LOD studies applies to VOC analyses by purge-and-trap and sorbent tubes only, provided that:
- The measured surrogate concentrations are known to be within the instrument's quantitation range, i.e., between the sample LOQ and the high calibration standard.
- Surrogate ICALs are still a requirement of the QSM.
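Putting the two surrogate FAQs together (dilution-out on one hand, the quantitation-range condition on the other), the decision logic can be sketched as follows. The function name, status strings, and numbers are illustrative, not QSM terminology:

```python
# Illustrative surrogate evaluation logic: a surrogate is "diluted out"
# (recovery not evaluated) when dilution pushes its on-instrument
# concentration below the LOQ; a result above the high calibration
# standard is outside the quantitation range. All values hypothetical.
def surrogate_status(measured, dilution_factor, loq, high_cal):
    """Classify a surrogate result relative to the quantitation range."""
    on_instrument = measured / dilution_factor
    if on_instrument < loq:
        return "diluted out"
    if on_instrument > high_cal:
        return "above calibration range"
    return "evaluate recovery"

# A 20x dilution drops a 50 ug/L surrogate to 2.5 ug/L, below a 5 ug/L LOQ
print(surrogate_status(50.0, 20.0, 5.0, 200.0))
```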
Frequently Asked Questions (FAQs)
Appendix A: Why do we need to report both LOD and MDL (DL)? Shouldn’t it be one or the other?
Both MDL (DL) and LOD are included largely for risk assessment purposes. However, it is correct that for many projects either would be sufficient. In that case, project-specified reporting directives may be used in lieu of the QSM.
Table -1, Confirmation: Would a J flag be used if a second-column confirmation was a non-detect (i.e., RPD > 40%) but the primary column was a detect?
If both columns were detects but the secondary-column RPD was > 40%, a laboratory would use a J flag. In the stated situation, the result was not confirmed, so a J flag would not be appropriate.
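The dual-column confirmation logic above can be sketched as follows. The flag strings are illustrative rather than QSM-mandated reporting codes; the RPD formula (difference over the mean of the two results, times 100) is the conventional one:

```python
# Illustrative dual-column confirmation check: both columns detect and
# RPD > 40% -> J flag; confirmation column is a non-detect -> result is
# not confirmed, so no J flag. None represents a non-detect.
def confirmation_flag(primary, secondary, rpd_limit=40.0):
    """primary/secondary are measured concentrations; None = non-detect."""
    if secondary is None:
        return "not confirmed"  # J flag not appropriate
    rpd = 100.0 * abs(primary - secondary) / ((primary + secondary) / 2.0)
    return "J" if rpd > rpd_limit else "confirmed"

print(confirmation_flag(10.0, 6.0))   # RPD = 50% -> "J"
print(confirmation_flag(10.0, None))  # non-detect -> "not confirmed"
```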
Frequently Asked Questions (FAQs)
RADIOCHEMISTRY:
The previous version of the QSAS required tracer and carrier yields determined by indirect measurements to be within 40–110%. The current QSM/QSAS specifies 30–110%. Was the change intentional, and was there a specific driver for it?
Yes. The change was intentional, made to align the QSAS with the new consolidated QSM/QSAS and make all criteria consistent.
Frequently Asked Questions (FAQs)
RADIOCHEMISTRY:
Can a laboratory use a CCV as an LCS for any radiochemical analysis that does not involve any preparation steps?
Yes. For methods that do not involve any preparation steps between a Continuing Calibration Verification (CCV) and a Laboratory Control Sample (LCS), such as gamma spectroscopy and alpha/beta wipes, a CCV can be used in place of an LCS or vice versa. When one sample serves as both CCV and LCS, the laboratory must use the method-specified acceptance criteria for CCV evaluations AND the laboratory's in-house statistically established control limits for LCS evaluations. The laboratory must also use the CCV for trending purposes if it serves as the LCS.
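The dual-criteria rule above (method-specified CCV limits AND in-house LCS control limits) amounts to a simple conjunction. A sketch with hypothetical numeric limits, not QSM values:

```python
# Illustrative dual-criteria check for a single measurement serving as
# both CCV and LCS: it must satisfy the method-specified CCV acceptance
# window AND the lab's in-house statistically derived LCS control limits.
# Both windows below are placeholder values.
def ccv_as_lcs_acceptable(percent_recovery,
                          ccv_limits=(90.0, 110.0),   # method-specified
                          lcs_limits=(85.0, 112.0)):  # in-house control limits
    ccv_ok = ccv_limits[0] <= percent_recovery <= ccv_limits[1]
    lcs_ok = lcs_limits[0] <= percent_recovery <= lcs_limits[1]
    return ccv_ok and lcs_ok

# 88% passes the hypothetical LCS limits but fails the CCV window
print(ccv_as_lcs_acceptable(88.0))
```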
Frequently Asked Questions (FAQs)
RADIOCHEMISTRY:
The main body of the text discusses the acceptability of reporting results when tracer or carrier yields are less than 30%. However, the tables for Alpha Spectrometry (Table 16), Gas Flow Proportional Counting (Table 18), and Liquid Scintillation Counter Analysis (Table 19) simply list the lower acceptance limit for tracers and carriers as 30%. Are we correct in assuming that in this instance the additional details in the main body text can be applied when evaluating the acceptability of carrier and tracer yields, even though those details are not in the tables?
Yes, you are correct. The additional details in the text can be applied; the tables themselves do not expand beyond basic QC requirements.
Frequently Asked Questions (FAQs)
RADIOCHEMISTRY:
We noticed that MARLAP requirements have been inserted as an option in the radiochemistry tables. Can our laboratory use MARLAP definitions such as MDC instead of MDA?
Yes. In Module 6 (Radiochemistry), section 1.3, the QSM/QSAS provides a clarification that the terms, definitions, and requirements of the MARLAP Manual (July 2004) can be used.
For future consideration?
Do we need more clarification on when it is appropriate/not appropriate to use “force through zero” calibrations?
What about negative intercepts in general? How do we address negative Continuing Calibration Blanks (CCBs) in metals?
Our data validators are seeing CCBs in metals being misused (CCB failures not resulting in sample re-analysis). Should CCB failures be treated the same as CCV failures?
Do we need a general formula for calculating the Detection Limit (DL) using the Combined Standard Uncertainty (CSU)?
Should we discuss the use of a background quench curve to assess the proper background to be subtracted for scintillation counters?
Do we need additional tables for air methods such as TO-14, TO-15, and Method 1668 (PCB Congeners)? What about multi-increment sampling?
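The force-through-zero question above can be made concrete by fitting the same calibration data with and without an intercept; a large fitted intercept relative to the low standard is one sign that forcing the line through zero may bias low-level results. Purely an illustrative sketch, not an EDQW recommendation:

```python
# Compare a calibration fit forced through the origin (y = m*x) with an
# ordinary least-squares fit (y = m*x + b) on the same standards. The
# data are invented: a perfectly linear response with a small positive
# blank offset, where forcing through zero distorts the slope.
def fit_through_zero(xs, ys):
    """Least-squares slope for y = m*x (no intercept)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def fit_with_intercept(xs, ys):
    """Ordinary least-squares slope and intercept for y = m*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    m = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return m, my - m * mx

# Calibration standards (conc, response); true line is y = 10x + 2
xs = [1.0, 5.0, 10.0, 50.0, 100.0]
ys = [12.0, 52.0, 102.0, 502.0, 1002.0]
m0 = fit_through_zero(xs, ys)
m1, b1 = fit_with_intercept(xs, ys)
print(m0, m1, b1)
```

With the offset present, the unconstrained fit recovers the true slope and intercept, while the through-zero fit absorbs the offset into a biased slope that overstates low-level results.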
QUESTIONS????
For more information:
https://www.denix.osd.mil/portal/page/portal/EDQW
www.navylabs.navy.mil
Fred McLean: [email protected]
Alyssa Wingard: [email protected]