An approach to the assessment of the quality of environmental monitoring data

Judith Dobson,a Michael Gardner,*b Brian Miller,c Michael Jessepd and Richard Toftd

aScottish Environment Protection Agency, Avenue North, Riccarton, Edinburgh, UK EH14 4AP
bWRc, Henley Road, Medmenham, Marlow, UK SL7 2HD
cScottish Environment Protection Agency, Murray Road, East Kilbride, Glasgow, UK G75 0LA
dEnvironment Agency, National Laboratory Service, Edwalton, Nottingham, UK NG2 2HN

Received 7th September 1998, Accepted 26th September 1998

This paper reports an approach to the assessment of the validity of environmental monitoring data—a 'data filter'. The strategy has been developed through the UK National Marine Analytical Quality Control (AQC) Scheme for application to data collected for the UK National Marine Monitoring Plan, although the principles described are applicable more widely. The proposed data filter is divided into three components: Part A, 'QA/QC'—an assessment of the laboratory's practices in Quality Assurance/Quality Control; Part B, 'fitness for purpose'—an evaluation of the standard of accuracy that can be demonstrated by activities in (A), in relation to the intended application of the data; and Part C, the overall assessment on which data will be accepted as usable or rejected as being of suspect quality. A pilot application of the proposed approach is reported. The approach described in this paper is intended to formalise the assessment of environmental monitoring data for fitness for a chosen purpose. The issues important to fitness for purpose are discussed and assigned a relative priority order on which to judge the reliability/usefulness of monitoring data.
Introduction

It is accepted that data collected in environmental monitoring programmes are not necessarily of adequate accuracy or comparability for their intended purpose. This has led to the implementation of a range of quality control (QC) measures1 designed to ensure that analytical uncertainty is controlled within acceptable limits. Several monitoring programmes, in the UK2–4 and internationally,5–8 have incorporated the definition of standards of analytical quality and quality assurance that are consistent with the overall aims of the monitoring programme.

Although the primary function of QC activity is to identify and control analytical uncertainty, an important secondary role is to provide an objective demonstration that the desired level of control has been achieved and maintained. QC information (from sampling and analysis, although this paper concentrates on the latter) can be used in a monitoring programme as evidence that the data and any conclusions drawn from them are reliable and credible. In the great majority of monitoring programmes, satisfactory performance in proficiency tests (externally co-ordinated interlaboratory tests aimed at monitoring a laboratory's routine analytical performance) has been seen as a means by which a laboratory's data can be shown to be acceptable. Thus the results of proficiency tests have been used as the principal source of QC information. Whilst satisfactory performance in such tests is an important component in a demonstration of adequate performance, it can prove difficult to assess data quality on this basis alone. One of the main problems is that interlaboratory tests are relatively infrequent, because they are expensive and difficult to organise. This means that decisions concerning the validity of a laboratory's data for a period of one year might have to be taken on the basis of as few as two or three test results.

The validation of a laboratory's monitoring data could be set on a firmer footing if use were made of information from the full spectrum of QC activity; that is, if data from performance tests, routine QC charts etc., as well as interlaboratory tests, were used in the evaluation of the quality and reliability of data. Such an approach, if it were to gain wide acceptance, would also need to be as objective as possible, based on sound principles and clearly defined. This process by which data are categorised as acceptable and fit for their intended purpose—called here 'data filtration'—should also be open to scrutiny, both when it is applied and subsequently, to allow re-evaluation of conclusions or consideration of alternative uses for the data.

This paper describes current proposals that are intended to set the assessment of the validity of environmental monitoring data on a more quantitative basis. These ideas have been developed through the UK National Marine Analytical Quality Control (AQC) Scheme3 for application to data collected for the UK National Marine Monitoring Plan (NMP), although we believe that the principles described below are applicable more widely.

Outline of the proposed data filter

The proposed data filter is divided into three components: Part A, 'QA/QC'—an assessment of the laboratory's practices in Quality Assurance/Quality Control; Part B, 'fitness for purpose'—an evaluation of the standard of accuracy that can be demonstrated by activities in (A), in relation to the intended application of the data; and Part C, the overall assessment on which data will be accepted as usable or rejected as being of suspect quality. The details of the approach are summarised in Table 1.

Part A. QA/QC criteria

Part A involves an assessment of: the laboratory's accreditation status; the degree and detail of testing of analytical systems;

J. Environ. Monit., 1999, 1, 91–95 91



Published on 01 January 1999.


Table 1 Data filter criteria

Subject/query  Score

A. Laboratory operation, QA/QC. To be asked for each determinand or determinand group

1. QA/accreditation
(a) Is accreditation held by the laboratory at all?  2
(b) Is accreditation held for the specific tests (determinations) of interest—sample matrix and concentration levels?  5
(c) Are accuracy requirements defined for each analytical objective (is the laboratory aware of the purpose of analyses?)?  3

2. Testing of analytical systems
(a) Have satisfactory results been obtained using independent reference materials/samples?  4
(b) Is limit of detection estimated either from the precision of blank determinations or from the calibration curve?  3
(c) Are performance tests carried out across the defined analytical range, i.e. for at least two concentration levels?  4
(d) Are within batch and between batch random errors estimated?  4
(e) Do estimates of standard deviation have at least 10 degrees of freedom?  5
(f) Is recovery estimated (where practicable) for all relevant sample matrices?  5

3. Routine quality control
(a) Are control charts plotted (or electronic equivalent) for the determinands of interest?  7
(b) Do the principal control materials contain a matrix relevant to the samples of interest?  5
(c) Is the determinand concentration in QC samples either close to that in real samples or near to the quality standard monitored?  5
(d) What action is specified and carried out in case of out of control?  4
(e) Is a QC determination included in every batch of analysis?  5
(f) Is more than one type of control analysis carried out for any determinand of interest (e.g. duplicate and spiking recovery)?  4
(g) What is the frequency of points plotted on the control chart?
    More than forty per year  10
    Between 10 and 40 per year  5
    Average of less than ten per year  −5

4. Proficiency tests
(a) Does the laboratory take part in a relevant proficiency testing scheme (evidence from at least 2 exercises per year)?  10
(b) Is the percentage of results identified as outside National Marine Analytical Quality Control (NMAQC) limits of error:
    Less than 15%  15
    15 to 25%  10
    More than 25%  2

B. Fitness for purpose criteria  Factor

Is technical approach to sample handling and analysis consistent with the defined determinand of interest?  1
Is limit of detection at least one-tenth of principal level of interest?  1
Is estimated total standard deviation consistent with target maximum tolerable standard deviation?  1
Has evidence been provided that systematic error is less than maximum tolerable level?  1

Overall factor  1
Overall score  100
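The mechanics of the scheme in Table 1 can be sketched in code. The following Python fragment is our own illustration, not part of the published scheme: only the question weights and the worked example discussed under Part B (factors of 1, 1, 0.95 and 0.85 applied to a Part A score of 85) are taken from the paper; the dictionary keys and function names are invented for the sketch.

```python
# Part A: points awarded for a positive answer to each yes/no question
# in Table 1 (question labels are our own shorthand).
PART_A_WEIGHTS = {
    "1a_accredited": 2, "1b_accredited_specific": 5, "1c_accuracy_defined": 3,
    "2a_reference_materials": 4, "2b_lod_estimated": 3, "2c_range_tested": 4,
    "2d_within_between_batch": 4, "2e_df_at_least_10": 5, "2f_recovery": 5,
    "3a_control_charts": 7, "3b_relevant_matrix": 5, "3c_relevant_conc": 5,
    "3d_out_of_control_action": 4, "3e_qc_every_batch": 5, "3f_two_qc_types": 4,
    "4a_proficiency_scheme": 10,
}
# Multiple-choice items: exactly one response may be chosen for each.
CHART_FREQUENCY = {">40/yr": 10, "10-40/yr": 5, "<10/yr": -5}
PROFICIENCY_FAILURES = {"<15%": 15, "15-25%": 10, ">25%": 2}

def part_a_score(yes_answers, chart_freq, failure_band):
    """Sum the weights for positive responses plus the two choice items."""
    score = sum(PART_A_WEIGHTS[q] for q in yes_answers)
    return score + CHART_FREQUENCY[chart_freq] + PROFICIENCY_FAILURES[failure_band]

def overall_score(a_score, part_b_factors):
    """Part C: multiply the Part A score by the product of Part B factors."""
    factor = 1.0
    for f in part_b_factors:
        factor *= f
    return a_score * factor

# A fully satisfactory return scores the maximum of 100.
print(part_a_score(PART_A_WEIGHTS, ">40/yr", "<15%"))          # 100
# The worked example from Part B: overall score of 69 out of 100.
print(round(overall_score(85, [1, 1, 0.95, 0.85])))            # 69
```

The maximum obtainable Part A score is 100 by construction (10 for accreditation, 25 for testing of analytical systems, 40 for routine QC and 25 for proficiency tests), matching the weighting rationale given in the text.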

routine quality control measures; and performance in proficiency tests. These criteria, suitably weighted, are used to form the basis of a score which is assigned to each block of data for a sample matrix/determinand combination.

The figure against each of the 18 individual questions that make up the QA/QC section is the score awarded for a positive response to that question. Where alternative responses are available, only one may be chosen.

The authors recognise that the assignment of the relative importance of each individual aspect of quality control is to some extent subjective; others might put a different weight on one or another element of QA/QC. The demands of different data users or monitoring programmes might lead to a different allocation of priorities. However, after due consideration of the needs of a broad, long-term, national/international monitoring programme, we have set the scores as indicated in Table 1. This is on the basis of our perception of the key issues relating to the generation of analytical data that are both technically sound and of demonstrable quality. The choice of routine QC for the highest weighted scoring factor reflects our view that the most important aspect of QA/QC is that there should be a system of routine quality control, and that this should be based on the establishment and maintenance of statistical control9–13 (allocated a maximum of 40 points out of 100). Routine QC is the means by which the performance of an analytical system can be stabilised and maintained throughout its period of use, i.e. the time over which it operates without significant modification.

The function and importance of routine QC are underpinned by the fact that any individual laboratory should have carried out a set of tests to establish that the performance of the analytical system meets a set of defined, application-related requirements.14,15 This usually involves tests to establish that the short and long term precision of measurement is satisfactory and that systematic error (bias) is adequately controlled. This criterion is allocated a total of 25 out of 100 points.

That a laboratory should provide an independent demonstration of analytical capability is rated equally important with performance testing. Achievement of satisfactory standards in proficiency testing complements the elements of routine QC discussed above. Proficiency tests provide an important element in the demonstration of analytical capability;16 hence our allocation of 25 points out of the total of 100. The remaining points are set aside for accreditation.5,17

Whilst laboratory accreditation does not necessarily imply fitness for purpose for a given set of data, the fact that a laboratory is accredited (even if only for a related analysis) is compelling evidence that the laboratory has well-defined procedures for laboratory administration and organisation. Laboratory accreditation can usually be taken as evidence that such procedures are: (a) defined; (b) put into practice; and (c) checked and audited on a regular basis by both the laboratory itself and by a national/international accrediting organisation.

Part B. Fitness for purpose

Once the total for QA/QC is established for a given laboratory/determinand/matrix combination, it is necessary to consider the fitness for purpose issues identified in Part B of the data filter. This entails examination of the evidence provided, through performance tests and internal and external QC, that the laboratory can meet the standard of performance implied by the intended application of the data. A factor of '1' would be used for laboratories which show that they do meet the programme requirements. Factors of less than 1 would be allocated if it was felt the laboratory's evidence of performance indicated, say, an inadequate limit of detection or unacceptably large bias for the sample types of interest. The individual factors in Part B would be multiplied together to give an overall factor. The data filter is then operated by multiplying the Part A score by the appropriate fitness for purpose factor.

For example, suppose a laboratory could demonstrate that, of the four criteria in Part B of Table 1, its methods of sample collection and handling were consistent with the project aims (awarded a factor of '1'), its limit of detection was satisfactory (awarded a factor of '1'), its precision of analysis was not quite adequate—between 1 and 1.5 times the required value for standard deviation (awarded a factor of 0.95)—and its control of bias was poor—recovery between 70% and 80% (awarded a factor of 0.85). The overall factor would be 1×1×0.95×0.85=0.81. This factor would be applied to a QA/QC score from Part A of (say) 85 to give an overall score of 69 out of 100. The actual factors (of less than 1) assigned to different degrees of deficiency would vary with the needs of the programme. Those chosen in this example are illustrations which we envisaged might apply to the programme discussed in the pilot study below. In other cases, the assessment might be different, e.g. where bias is seen as a more serious problem a lower factor would be used. The key issue for openness and an accountable assessment is that the factors chosen and the basis of the choice should be recorded.

The separation of laboratory practice and organisation and the level of performance achieved (i.e. Parts A and B of the filter) is seen as critical. A laboratory might be well organised with a sound approach to quality control and a good record in proficiency tests, yet it might not be operating an analytical system which is geared to, or fit for, the purpose in hand. The extent to which routine quality control analyses and external proficiency tests are relevant to the 'purpose' (with respect to the types of samples tested and their concentration levels) is a key link between aspects that are primarily QA and those which relate more directly to data use.

Part C. Overall assessment

The last stage in the use of the data filter is to review the overall scores and to decide what level is acceptable to the programme. This is probably the most subjective part of the process. For most well-controlled analyses, this threshold of acceptability is likely to be high—say a score of over 70 out of 100. This is on the basis that the data filter has been designed not to require anything greatly in excess of basic good practice; hence a score substantially less than 100 would indicate some serious deficiency.

However, for some determinand/matrix combinations, it has proved difficult to meet the NMP required standard of performance on a routine basis. This might be due to the fact that currently available analytical techniques cannot meet the required targets or (more commonly) that participating laboratories are in the process of implementing new techniques. In these cases, the level of acceptability might need to be set at a lower level. If this is done, any relaxation of the acceptable standard should be performed on the basis that laboratories struggle to meet the fitness for purpose standard, rather than because the principles of good QC practice have been compromised. The provisional acceptance of data that do not fully meet the needs of the end user is justified on the basis that any interpretation can be made in the light of the deficiencies which have been identified (and recorded) and that the approach can be used to identify areas where improvements are needed.

Pilot study

In order to test the practicality of the above proposals, the Co-ordinating Committee of the National Marine AQC Scheme recommended that the data filter should be used on a limited set of determinands in a sub-set of participating laboratories. The determinands chosen were: ammonia in sea-water, copper in marine sediments and PCBs in biota. Laboratories were asked to respond to the 18 questions in Part A of the questionnaire and to provide relevant documentary evidence to support their responses (to confirm Part A, see Table 1, and in support of Part B). In the case of the section on accreditation, this evidence comprised the laboratory's Schedule of Accreditation. For performance testing, the results of such tests and any interpretation were requested. The required evidence on routine quality control comprised a copy of relevant statements on quality control policy, copies of the control charts and the following information for the determinands of interest:
(i) dates for which the chart is applicable (i.e. the first and latest batch);
(ii) the control sample type (i.e. the sample type/matrix);
(iii) the control sample concentration;
(iv) what values the chart limits are based on;
(v) control rules (i.e. the defined approach to identifying loss of control and the required response).

Laboratories were not asked to supply data for their proficiency test performance, because such data were already summarised and available to the authors on the NMAQC database—which stores all relevant proficiency test data for the Scheme from 1993 to the present. The timescale covered by the trial was specified as July 1996 to June 1997 (with respect to QC practice and the proficiency tests assessed).

Table 2 gives an example of one laboratory's questionnaire return. Figs. 1–3 summarise the responses from all laboratories taking part in the pilot study—in terms of their overall scores and the contributory scores for each of the four main subject areas.

A close examination of the questionnaire returns, in relation to the accompanying evidence, revealed a small number of discrepancies. These were largely to do with the interpretation of the questions asked. Consequently, it is desirable that participating laboratories should supply the evidence required by the data filter and that its interpretation and the allocation of scores should be done on an agreed common basis by an expert panel.

Fitness for purpose criteria in Part B of the data filter were examined once the information in Part A of the questionnaire was available. The fitness for purpose criteria are used to multiply the QA/QC scores to provide an overall score which expresses the accuracy and reliability of the data for a particular monitoring programme or a specified use. A fitness for purpose factor of '1' is assigned for a wholly satisfactory response.

For the purpose of the pilot study and given that the laboratories concerned had set up systems expressly to meet


Table 2 Example of laboratory questionnaire return (RM, reference material; LOD, limit of detection; sw, random error within batch; sb, random error between batches; df, degrees of freedom)

Lab. B                 Ammonia           Cu sed.           PCB
                       Yes/no  Score     Yes/no  Score     Yes/no  Score

Accreditation
(a)                    y       2         y       2         y       2
(b)                    y       5         y       5         y       5
(c)                    y       3         y       3         y       3
Sub-total                      10                10                10

Testing systems
(a) RMs                n       0         y       4         y       4
(b) LOD                y       3         y       3         y       3
(c) Range              n       0         n       0         y       4
(d) sw/sb              y       4         y       4         y       4
(e) df                 n       0         n       0         n       0
(f) Recovery           y       5         y       5         y       5
Sub-total                      12                16                20

Routine QC
(a) Charts             y       7         y       7         y       7
(b) Matrix             y       5         y       5         y       5
(c) Range              y       5         y       5         n       0
(d) Action rules       y       4         y       4         y       4
(e) Every batch        y       5         y       5         y       5
(f) Types              n       0         y       4         y       4
(g) Frequency
    >40 per year       y       10                0                 0
    15–40 per year             0         y       5                 0
    <15 per year               0                 0         y       −5
Sub-total                      36                35                20

Proficiency tests
(a) Take part          y       10        y       10        y       10
(b) Pass
    <15%                       0                 0                 0
    15 to 25%                  0         y       10                0
    >25%               y       2                 0         y       2
Sub-total                      12                20                12

Total                          58        Total   61        Total   50
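The section sub-totals in Table 2 can be cross-checked arithmetically. The following Python fragment is our own illustration; the individual scores are transcribed from the table and the dictionary layout is invented for the sketch.

```python
# Lab. B's individual question scores from Table 2, grouped by section
# and determinand (ammonia in sea-water, copper in sediment, PCBs in biota).
scores = {
    "accreditation": {"ammonia": [2, 5, 3],
                      "cu_sed": [2, 5, 3],
                      "pcb": [2, 5, 3]},
    "testing":       {"ammonia": [0, 3, 0, 4, 0, 5],
                      "cu_sed": [4, 3, 0, 4, 0, 5],
                      "pcb": [4, 3, 4, 4, 0, 5]},
    "routine_qc":    {"ammonia": [7, 5, 5, 4, 5, 0, 10],
                      "cu_sed": [7, 5, 5, 4, 5, 4, 5],
                      "pcb": [7, 5, 0, 4, 5, 4, -5]},
    "proficiency":   {"ammonia": [10, 2],
                      "cu_sed": [10, 10],
                      "pcb": [10, 2]},
}

# Sum each section for each determinand to reproduce the sub-totals.
subtotals = {sec: {d: sum(v) for d, v in by_det.items()}
             for sec, by_det in scores.items()}
print(subtotals["routine_qc"])  # {'ammonia': 36, 'cu_sed': 35, 'pcb': 20}
```

Each computed sub-total matches the corresponding printed sub-total in Table 2 (e.g. routine QC: 36, 35 and 20).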

[Fig. 1 Relative assessment of the quality of laboratory data: ammonia in sea-water. Bar chart of score (%) for Labs A–G: accreditation, testing systems, routine QC, proficiency tests and total.]

[Fig. 2 Relative assessment of the quality of laboratory data: copper in marine sediment. Bar chart of score (%) for Labs A–G: accreditation, testing systems, routine QC, proficiency tests and total.]
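One use of scores such as those plotted in Figs. 1–3 is to flag laboratories whose proficiency-test record and overall data filter assessment disagree. A Python sketch of such a check follows; the function, the threshold and the laboratory scores below are our own illustration and are not read from the figures.

```python
# Flag laboratories whose proficiency score (%) exceeds their overall
# data filter score (%) by more than `threshold` percentage points,
# i.e. cases where proficiency tests alone would be misleading.
def flag_disagreements(labs, threshold=20):
    return [name for name, (proficiency, overall) in labs.items()
            if proficiency - overall > threshold]

# Illustrative values only: a laboratory can do well in proficiency
# tests yet score poorly overall (cf. Laboratory G for ammonia).
example = {"Lab E": (80, 55), "Lab F": (80, 75), "Lab G": (90, 40)}
print(flag_disagreements(example))  # ['Lab E', 'Lab G']
```
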

the needs of NMP monitoring, we assigned factors of 1 to all laboratories. However, a wider implementation of the data filter would require a more detailed review of these issues.

Discussion

It is interesting to examine the difference between an assessment based on the results of proficiency tests alone and one derived using the data filter criteria. For the determination of copper in sediments (Fig. 2)—a measurement for which there is a long history of satisfactory performance and good comparability amongst results from different laboratories—the picture provided by proficiency tests is similar to that of the overall assessment. Nevertheless, taking Laboratories E and F as examples, both have similar records of proficiency test success, but the overall assessment of the former is noticeably worse than that of the latter, largely as a result of the higher score of Laboratory F in respect of its performance testing work. This higher overall score reflects the fact that Laboratory F (although its analytical performance, as illustrated by proficiency tests, is probably equivalent to that of Laboratory E) has established a better demonstration of its capabilities.

For less well-controlled determinands, the achieved standard of performance is both wider in range and, in some cases, lower than desired. This produces the picture that is evident for the determination of ammonia in water and PCBs in biota (Figs. 1 and 3). An assessment based on the results of proficiency tests is markedly different from the overall evaluation given by the data filter. For example, for PCBs in biota, Laboratories B and D achieve a moderately good overall assessment of over 60%. Both of these laboratories show relatively poor proficiency test scores, but they have relatively high contributions for routine QC and performance testing. On the other hand, for ammonia, one of the better proficiency test performers (Laboratory G) scores an unacceptably low overall value of 40% because of serious deficiencies in its routine QC programme. This suggests that an assessment based on proficiency testing alone might be misleading and that a more comprehensive evaluation of a laboratory's approach to quality is preferable.

[Fig. 3 Relative assessment of the quality of laboratory data: PCBs in marine biota. Bar chart of score (%) for Labs A–G: accreditation, testing systems, routine QC, proficiency tests and total.]

The proposed approach can also offer several subsidiary benefits. Firstly, the graphical presentation of the assessment (as in Figs. 1–3) provides a clear illustration, to the data user, of the relative reliability of data for different determinands and sample matrices. This information can be useful when the data are interpreted and, for example, when pollution control priorities are assigned or future monitoring strategies are defined. This, in turn, can help to direct attention and resources towards steps intended to secure improvements in performance, or to strengthen any areas where performance is weak. Such measures might include the organisation of training seminars or the provision of advice by more experienced practitioners to those engaged in setting up analytical systems.

Secondly, the fact that the important elements of QA/QC have been defined, coupled with a clear statement of the required analytical performance, makes it possible for individual laboratories to examine their own quality scores, in relation to those of their peers. This will show where weaknesses lie and where improvements can be made. The application of the data filter on an annual basis will serve to illustrate where such improvements have been achieved.

Conclusions and recommendations

The assessment of the usefulness of a laboratory's data on the basis of proficiency tests alone is unlikely to prove reliable. An assessment based on a wider range of QC measures should prove more stable and reliable in the long term.

The data filter provides the basis for an objective set of criteria for examining the validity of environmental monitoring data for a defined application. It includes aspects of quality assurance, performance testing and routine quality control, as well as more commonly used criteria based on proficiency tests. The pilot study has indicated that the approach is practicable and has drawn attention to the following points.

The use of a questionnaire alone is insufficient. It is necessary to implement the approach by the collection of evidence and an assessment by an expert panel. The panel would be given the responsibility of evaluating the evidence supplied in relation to the data filter criteria. This will help to ensure that laboratories are reviewed on a common basis.

All decisions on interpretation should be clearly documented so that the process can be reviewed if it is decided to reassess data or to consider the suitability of data for an alternative purpose.

References

1 J. K. Taylor, Anal. Chem., 1981, 53, 1588A.
2 E. A. Simpson, J. Inst. Wat. Eng. Sci., 1978, 32, 45.
3 E. M. Dixon and M. J. Gardner, Sci. Total Environ., 1998, 216, 113.
4 J. E. Dobson, M. J. Gardner, A. H. Griffiths, M. A. Jessep and J. E. Ravenscroft, Accredit. Qual. Assur., 1997, 2, 294.
5 J. G. Timmerman, M. J. Gardner and J. E. Ravenscroft, UN/ECE Task Force on Monitoring and Assessment, Volume 4—Quality Assurance, ISBN 9036945860, RIZA, Lelystad, The Netherlands, 1996.
6 International Council for the Exploration of the Seas (ICES), Report of the ICES Advisory Committee on the Marine Environment, ICES Co-operative Research Report No. 217, ICES, Copenhagen, 1996, pp. 138–147.
7 L. D. Mee, M. Horvat and J. P. Villeneuve, Data Quality Review for MEDPOL: Nineteen Years of Progress, MAP Technical Report Series No. 81, Mediterranean Action Plan for Pollution (MEDPOL), Athens, 1994.
8 D. E. Wells, A. Aminot, J. de Boer, W. Cofino, D. Kirkwood and B. Pedersen, Mar. Poll. Bull., 1997, 35, 3.
9 M. J. Gardner, The Use of Quality Control Charts in Water Analysis, WRc Report CO4239, WRc, Medmenham, UK, 1996.
10 R. J. Howarth, Analyst, 1995, 120, 1851.
11 M. Thompson and R. Wood, Pure Appl. Chem., 1995, 67, 649.
12 Analytical Methods Committee, Analyst, 1995, 120, 29.
13 Analytical Methods Committee, Analyst, 1989, 114, 1497.
14 R. V. Cheeseman, M. J. Gardner and A. L. Wilson, A Manual on Analytical Quality Control for the Water Industry, WRc Report NS30, ISBN 0902156853, WRc, Medmenham, UK, 1989.
15 CEN, Guide to Analytical Quality Control for Water Analysis, Draft ENV CEN TC230 WG1, 1995.
16 M. J. Gardner, J. E. Dobson, A. H. Griffiths, M. A. Jessep and J. E. Ravenscroft, Mar. Poll. Bull., 1997, 35(6), 125.
17 European Standard EN45001, General Criteria for the Operation of Testing Laboratories, CEN, Brussels, 1989.

Paper 8/06978F
