Stability is normally checked to ensure that the measurand(s) did not change during the course of the round. As specified in ISO 13528, the IUPAC International Harmonized Protocol and ISO Guide 35, proficiency test items should be tested under the variety of conditions that occur in the normal operation of a proficiency testing scheme, e.g. conditions of shipping and handling when distributed to participants. The criterion for acceptable instability is the same as the criterion for inhomogeneity in ISO 13528, although typically with fewer tests or measurements.
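As an illustration only, the following minimal Python sketch applies a stability check along the lines of ISO 13528:2005, where the absolute difference between the mean of results obtained before distribution and the mean obtained at the end of the round should not exceed 0,3 times the standard deviation for proficiency assessment. The function name, the example data and the numerical values shown here are assumptions made for the illustration, not part of this International Standard.

```python
from statistics import mean

def is_stable(results_before, results_after, sigma_pt, factor=0.3):
    """Illustrative stability check in the spirit of ISO 13528:2005.

    Assumption: the proficiency test item is treated as adequately stable if
    the absolute difference between the mean of measurements made before
    distribution and the mean made at the end of the round does not exceed
    `factor` times the standard deviation for proficiency assessment
    (sigma_pt).
    """
    return abs(mean(results_before) - mean(results_after)) <= factor * sigma_pt

# Hypothetical duplicate measurements on retained items before and after the round
before = [10.2, 10.4, 10.3]
after = [10.5, 10.6, 10.4]
sigma_pt = 0.8  # assumed standard deviation for proficiency assessment
print(is_stable(before, after, sigma_pt))  # |10.3 - 10.5| = 0.2 <= 0.24, so True
```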

Annex C

(informative)

Selection and use of proficiency testing schemes

C.1 General

This annex establishes principles for the selection and use of proficiency testing schemes by participants and other interested parties. This annex is also intended to promote the harmonized use of proficiency testing schemes by interested parties (e.g. accreditation bodies, regulatory bodies, or customers of the participant).

Since results from proficiency testing schemes may be used in the evaluation of a participant's performance, it is important that both the interested parties and participants have confidence in the development and operation of the proficiency testing schemes.

It is also important for participants to have a clear understanding of the policies of the interested parties for participation in such proficiency testing schemes, the criteria they use for judging successful performance in proficiency testing schemes, and their policies and procedures for following up any unsatisfactory results from a proficiency test round. However, apart from specific requirements from regulatory bodies, it is the responsibility of the participants themselves to select the appropriate proficiency testing scheme and to evaluate their results correctly.

It should be recognized, however, that interested parties also take into account the suitability of test data produced from activities other than proficiency testing schemes, including, for example, results of participants' own internal quality control procedures with control samples, comparison with split-sample data from other participants and performance on tests of certified reference materials. Therefore, when selecting a proficiency testing scheme, the participant should take into consideration the other quality control activities that are available or have already been performed.

C.2 Selection of proficiency testing schemes

C.2.1 Laboratories (and other types of participants) need to select proficiency testing schemes that are appropriate for their scope of testing or scope of calibration. The proficiency testing schemes selected should comply with the requirements of this International Standard.

C.2.2 In selecting a proficiency testing scheme, the following factors should be considered:

  1. the tests, measurements or calibrations involved should match the types of tests, measurements or calibrations performed by the participant;

  2. the availability to interested parties of details about the scheme design, procedures for establishment of assigned values, instructions to participants, statistical treatment of data, and the final summary report;

  3. the frequency at which the proficiency testing scheme is operated;

  4. the suitability of the organizational logistics for the proficiency testing scheme (e.g. timing, location, sample stability considerations, distribution arrangements) relevant to the group of participants proposed for the proficiency testing scheme;

  5. the suitability of acceptance criteria (i.e. for judging successful performance in the proficiency test);

  6. the costs;

  7. the proficiency testing provider's policy on maintaining participants' confidentiality;

  8. the timescale for reporting of results and for analysis of performance data;

  9. the characteristics that instil confidence in the suitability of proficiency test items (e.g. homogeneity, stability, and, where appropriate, metrological traceability to national or international standards);

  10. its conformance with this International Standard.

NOTE Some proficiency testing schemes can include tests which are not an exact match for the tests performed by the participant (e.g. the use of a different national standard for the same determination), but it can still be technically justified to participate in the proficiency testing scheme if the treatment of the data allows for consideration of any significant differences in test methodology or other factors.

C.3 Policies on participation in proficiency testing schemes

C.3.1 If relevant, interested parties should document their policies for participation in proficiency testing schemes; such documented policies should be publicly available to laboratories and other interested parties.

C.3.2 Issues which should be addressed in participation policies for specific proficiency testing schemes include:

  1. whether participation in specific proficiency testing schemes is mandatory or voluntary;

  2. the frequency of participation;

  3. the criteria used by the interested party to judge satisfactory or unsatisfactory performance;

  4. whether participants may be required to participate in follow-up proficiency testing schemes if performance is judged to be unsatisfactory;

  5. how the results of proficiency testing will be used in the evaluation of performance and subsequent decisions;

  6. details of the interested party's policy on preserving participants' confidentiality.

C.4 Use of proficiency testing by participants

C.4.1 Participants should draw their own conclusions about their performance from an evaluation of the organization and design of the proficiency testing scheme. Reviews should consider the relation between the proficiency testing scheme and the needs of the participant's customers. The information that should be taken into consideration includes:

  1. the origin and character of proficiency test items;

  2. the test and measurement methods used and, where possible, the assigned values for particular test or measurement methods;

  3. the organization of the proficiency testing scheme (e.g. the statistical design, the number of replicates, the measurands, the manner of execution);

  4. the criteria used by the proficiency testing provider to evaluate the participants' performance;

  5. any relevant regulatory, accreditation or other requirements.

C.4.2 Participants should maintain their own records of performance in proficiency testing, including the outcomes of investigations of any unsatisfactory results and any subsequent corrective or preventive actions.

C.5 Use of results by interested parties

C.5.1 Accreditation bodies

C.5.1.1 The requirements for an accreditation body with regard to use of proficiency testing are specified in ISO/IEC 17011:2004, 7.15.

NOTE Further policies on proficiency testing relevant to the compliance of accreditation bodies with requirements for membership in the ILAC mutual recognition arrangement are specified in ILAC P-9.

C.5.1.2 The results from proficiency testing schemes are useful for both participants and accreditation bodies. There are, however, limitations on the use of such results to determine competence. Successful performance in a specific proficiency testing scheme may represent evidence of competence for that exercise, but may not reflect ongoing competence. Similarly, unsuccessful performance in a specific proficiency testing scheme may reflect a random departure from a participant's normal state of competence. It is for these reasons that proficiency testing should not be the only tool used by accreditation bodies in their accreditation processes.

C.5.1.3 For participants reporting unsatisfactory results, the accreditation bodies should have policies to

  1. ensure that the participants investigate and comment on their performance within an agreed time-frame, and take appropriate corrective action,

  2. (where necessary) ensure that the participants undertake any subsequent proficiency testing to confirm that any corrective actions taken by them are effective, and

  3. (where necessary) ensure that on-site evaluation of the participants is carried out by appropriate technical assessors to confirm that corrective actions are effective.

C.5.1.4 The accreditation bodies should advise their accredited bodies of the possible outcomes of unsatisfactory performance in a proficiency testing scheme. These may range from continuing accreditation subject to successful attention to corrective actions within agreed time-frames, temporary suspension of accreditation for the relevant tests (subject to corrective action), through to withdrawal of accreditation for the relevant tests.

NOTE Generally speaking, the options selected by an accreditation body will depend on the participant's history of performance over time and on the findings of the most recent on-site assessments.

C.5.1.5 The accreditation bodies should have policies for feedback from accredited bodies relating to action taken on the basis of results of proficiency testing schemes, particularly for unsatisfactory performance.

C.5.2 Other interested parties

C.5.2.1 Participants may need to demonstrate their competence to other interested parties, such as customers or parties to a subcontracting arrangement. Proficiency testing results, as well as other quality control activities, can be used to demonstrate competence, although proficiency testing should not be the only such activity.

NOTE Proficiency testing data used to validate claims of competence are normally used by organizations in conjunction with other evidence, such as accreditation. See C.5.1.2.

C.5.2.2 It is the responsibility of the participants to ensure that they have provided all the appropriate information to interested parties wishing to evaluate their competence.

C.6 Use of proficiency testing by regulatory bodies

C.6.1 The results from proficiency testing schemes are useful for regulatory bodies that need to evaluate the performance of participants covered by regulations.

C.6.2 If the proficiency testing scheme is operated by a regulatory body, it should be operated in accordance with the requirements of this International Standard.

C.6.3 Regulatory bodies that use independent proficiency testing providers should

  1. seek documentary evidence that the proficiency testing schemes comply with the requirements of this International Standard before recognizing the proficiency testing scheme, and

  2. discuss with participants the scope and operational parameters of the proficiency testing scheme, in order that the participants' performance may be judged adequately in relation to the regulations.

Bibliography

  1. ISO/IEC Guide 98-3, Uncertainty of measurement — Part 3: Guide to the expression of uncertainty in measurement (GUM:1995)

  2. ISO/IEC 17011:2004, Conformity assessment — General requirements for accreditation bodies accrediting conformity assessment bodies

  3. ISO/IEC 17025, General requirements for the competence of testing and calibration laboratories

  4. ISO 3534-1, Statistics — Vocabulary and symbols — Part 1: General statistical terms and terms used in probability

  5. ISO 5725-1, Accuracy (trueness and precision) of measurement methods and results — Part 1: General principles and definitions

  6. ISO 5725-2, Accuracy (trueness and precision) of measurement methods and results — Part 2: Basic method for the determination of repeatability and reproducibility of a standard measurement method

  7. ISO 5725-4, Accuracy (trueness and precision) of measurement methods and results — Part 4: Basic methods for the determination of the trueness of a standard measurement method

  8. ISO 13528:2005, Statistical methods for use in proficiency testing by interlaboratory comparisons

  9. ISO 15189, Medical laboratories — Particular requirements for quality and competence

  10. ISO Guide 34, General requirements for the competence of reference material producers

  11. ISO Guide 35, Reference materials — General and statistical principles for certification

  12. ISO/TS 21748, Guide to the use of repeatability, reproducibility and trueness estimates in measurement uncertainty estimation

  13. EN 14136, Use of external quality assessment schemes in the assessment of the performance of in vitro diagnostic examination procedures

  14. ASTM E1301-95, Standard Guide for Proficiency Testing by Interlaboratory Comparisons

  15. Standards for EQA schemes in laboratory medicine. Version 4.02, December 2004. Clinical Pathology Accreditation (UK) Ltd, Sheffield, UK

  16. National Occupational Standards for External Quality Assessment, HCS-EQA1 to HCS-EQA12. Competence Framework for Healthcare Science (www.skillsforhealth.org.uk/)

  17. EURACHEM/CITAC Guide CG4, Quantifying Uncertainty in Analytical Measurement, 2nd edition, 2000

  18. Thompson M., Ellison S.L.R., Wood R., “The International Harmonized Protocol for the proficiency testing of analytical chemistry laboratories” (IUPAC Technical Report), Pure and Applied Chemistry, Vol. 78, No. 1, pp. 145-196, 2006

  19. ILAC P-9:2005, ILAC Policy for Participation in National and International Proficiency Testing Activities

  20. ILAC P-10:2002, ILAC Policy on Traceability of Measurement Results

1 International Laboratory Accreditation Cooperation.

2 International Union of Pure and Applied Chemistry.