
12. QUALITY CONTROL AND ASSURANCE


12.1 Information transfer to Agency

The Topic Centre on Catalogue of Data Sources is currently working on many of these aspects of the environmental information network. For example, there must be a common language for determinants, sampled media and units, usually codified in a data dictionary. For water quality and quantity information there will be a requirement for aggregated data, rather than raw data, to be transferred to the Agency. As well as specifying codes, formats etc. for transfer, the Agency will have to specify the type of information. For example, monitoring information on water should contain site means, standard errors, confidence limits, maxima, minima and percentiles. In this way the variability and validity of spatial and temporal comparisons can be assessed and quantified. Details of analytical procedures, methods, limits of detection and quality control are also likely to be required. The following sections therefore briefly touch on some of the issues that will at some time be addressed by the Agency with support from the appropriate Topic Centre(s).


12.2 Data quality control


12.2.1 Handling data

The first stage in ensuring the quality of the collected sampling data is the appropriate choice of storage format. The data should be in a form which allows access to all relevant sample details (such as date and time of sampling, grid reference, etc.), which allows the data to be easily examined for erroneous entries, and which permits the data to be divided into subsets as desired. An ideal storage medium is a database system like Microsoft Access, Borland DBase IV or the Oracle RDBMS.

As well as choosing the data format, a further requirement is that all necessary sampling information is recorded alongside the actual sample value. This is important as once the data has been entered and stored it is likely to be difficult, if not impossible, to add retrospectively the missing information. This information will be needed not only for the purposes of the monitoring scheme, but also to help validate the data.

If the data are produced in a computer readable format at the time of sampling, then the direct transfer of the data onto computer will minimise human errors caused by re-entering the data.

In order to prevent mistakes from being made whilst transferring data from one user to another, a universally agreed data transfer format should be used. One example of a standard format is ASCII files (ordinary text) with comma delimited fields and one sample value, plus other details about the sample, per line. Although this format does not make the most efficient use of space, it allows the data to be read into a new database with little or no manipulation of the transfer files. This avoids errors and saves time and money. An example of how the information might look after importing into a database is given below.


Date      Time    Grid Ref.       Sample Code   Determinant Code   Units   Sample Value

12/3/89   10:59   637098 224573   00102         623                mg/l    0.34
12/3/89   13:59   637098 224573   00103         079                mg/l    2.307
12/3/89   13:59   637098 224573   00103         623                mg/l    0.796

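A transfer file in this format can be read into a database, or checked on receipt, with very little code. The sketch below, in Python, reads comma-delimited ASCII records and verifies that each line carries the expected number of fields; the field names are illustrative only, not a prescribed schema:

```python
import csv
import io

# Illustrative field names; the actual schema would be agreed between parties.
FIELDS = ["date", "time", "grid_ref", "sample_code",
          "determinant_code", "units", "value"]

def read_transfer_file(text):
    """Parse comma-delimited ASCII transfer records into dictionaries,
    rejecting any line with the wrong number of fields."""
    records = []
    for lineno, row in enumerate(csv.reader(io.StringIO(text)), start=1):
        if len(row) != len(FIELDS):
            raise ValueError(
                f"line {lineno}: expected {len(FIELDS)} fields, got {len(row)}")
        records.append(dict(zip(FIELDS, row)))
    return records

sample = "12/3/89,10:59,637098 224573,00102,623,mg/l,0.34\n"
print(read_transfer_file(sample))
```

Because the format is plain text, a rejected line can be inspected and corrected by eye before the file is re-submitted.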
The EU and United Nations have invested considerable resources in providing efficient solutions to the problems associated with data transfer. The UN developed EDIFACT (Electronic Data Interchange For Administration, Commerce and Transport) as a worldwide standard. The EC sponsored the application of this message system to environmental data exchange through the TEDIS programme. This system standardises the information format and ensures that all of the required supporting information is sent with the message. This concept has the advantage that one common interface can be used for the transfer of data. Unfortunately, the current system falls short of EEA requirements as it contains no data dictionary to standardise the codes associated with determinants, river sampling sites etc.


12.2.2 Detection of incorrectly entered data

The simplest form of check on entered data is to identify those values which fall outside the expected range. These apparently outlying values can then be verified, changed or discarded as appropriate. It is very important to note that data should only be discarded when they are definitely known to be incorrect. Outliers which occur due to random variation are valid values and their exclusion at this stage can bias results. Range checking methods are listed below.

  1. For determinant data, one way of identifying possibly wrong samples is to flag all those which are, for example, more than 3 standard deviations from the mean of that determinant (on a logarithmic scale if the data are skewed to the right). The validity of the flagged data should then be checked with the provider or source.
  2. A similar approach is to flag the highest and lowest P% of the data for that determinant (where P% is some suitably small value such as 1%).
  3. Errors are not always confined to the determinant data; dates, grid references etc. are just as likely to be wrong. Detection of these incorrect values will be simple in some cases. For example, dates before the start or after the finish of monitoring must be wrong, as must grid references that do not correspond to water bodies.
  4. In other cases, other variables can be used to make cross checks. For example, dates which are out of synchronisation with sample codes would imply that either the codes or the dates were wrong.
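The first of these checks can be sketched in a few lines. The function below (a minimal illustration; the threshold of 3 standard deviations and the function name are taken from, or chosen for, the example in point 1) returns the indices of values to be queried with the data provider, optionally working on a logarithmic scale for right-skewed determinants:

```python
import math

def flag_outliers(values, n_sd=3.0, log_scale=False):
    """Return indices of values more than n_sd standard deviations
    from the mean; with log_scale=True the check is done on log values,
    which suits data skewed to the right."""
    xs = [math.log(v) for v in values] if log_scale else list(values)
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
    return [i for i, x in enumerate(xs) if abs(x - mean) > n_sd * sd]
```

Note that flagged values are only candidates for checking: as the text stresses, they must not be discarded unless definitely known to be incorrect.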

Another method of quality checking is to use a statistical quality assurance scheme, in a similar way to analytical quality control. A number of data records are selected at random (with replacement) and checked for mistakes. The proportion of errors in the database is estimated from the proportion of errors in the randomly selected records, and a confidence interval for the proportion is also estimated. Quality standards are being met if the true proportion of errors is below some prescribed level with a certain level of confidence.

For example, suppose that the proportion of errors must be no more than 1% with 95% confidence. Table 12.1 below shows the one-sided 95% confidence intervals for different numbers of observed errors from 500 randomly selected records (Ellis, 1989).


Table 12.1 One-sided 95% confidence intervals for the true proportion of errors based on 500 randomly checked records

Number of errors

1 sided 95% CI for true proportion of errors

2

[0%, 1.3%]

1

[0%, 0.9%]

0

[0%, 0.6%]


As can be seen from the above table, if more than one error is observed then the quality standards are not being met and remedial action may be necessary. A disadvantage of such a statistical quality control scheme is that it can be expensive to implement.
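The bounds in Table 12.1 can be reproduced with an exact one-sided (Clopper-Pearson style) binomial confidence bound, found here by simple bisection. This is a sketch of the calculation rather than the method Ellis (1989) necessarily used, but it yields the same figures:

```python
from math import comb

def binom_cdf(x, n, p):
    """P(X <= x) for X ~ Binomial(n, p)."""
    return sum(comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(x + 1))

def upper_95(x, n):
    """One-sided 95% upper confidence bound for the true error proportion,
    given x errors found in n randomly checked records: the smallest p
    for which P(X <= x) drops to 0.05, located by bisection."""
    lo, hi = 0.0, 1.0
    while hi - lo > 1e-10:
        mid = (lo + hi) / 2
        if binom_cdf(x, n, mid) > 0.05:
            lo = mid
        else:
            hi = mid
    return hi

for errors in (0, 1, 2):
    print(errors, f"{upper_95(errors, 500):.1%}")  # matches Table 12.1
```

With 0 errors in 500 records the upper bound is about 0.6%, so the 1% standard is met; with 2 errors it is about 1.3%, so it is not.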


12.2.3 Analytical limits of detection and missing values

An agreed system of marking sample values below or above analytical limits of detection (LoD) should be used by all parties. The best system is to include an extra field in the database to indicate the state of the sample (for example, the field could contain a minus sign for samples below the LoD, a plus sign for samples above the LoD, and a blank if the sample was normal).

A convenient way of marking a sample as missing is to replace its value with some non-numeric marker, such as an asterisk.
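Reading such values back out is then mechanical. The sketch below applies the conventions described above (minus sign below the LoD, plus sign above, blank otherwise, asterisk for missing); the function name and return shape are illustrative:

```python
def parse_sample(value_field, flag_field):
    """Interpret a stored sample value together with its LoD flag field.

    flag_field: '-' below the limit of detection, '+' above it,
    '' (blank) for a normal sample. A value field of '*' marks a
    missing sample. Returns (value, status), value being None if missing.
    """
    if value_field == "*":
        return None, "missing"
    value = float(value_field)
    status = {"-": "below LoD", "+": "above LoD", "": "normal"}[flag_field]
    return value, status
```

Keeping the flag in its own field, rather than embedding '<' or '>' in the value itself, means the value column stays numeric and can be summarised directly.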


12.3 Analytical performance

The analytical methods described in Appendix C are the techniques commonly used in laboratories routinely analysing these determinants. This does not, however, preclude the use of other methods provided that the analytical performance can be proved to be adequate. They are typically generic methods (e.g. ICP-MS, flame photometry etc.), with most of the references being standard methods drawn up by the UK’s Standing Committee of Analysts (SCA). There are of course international bodies such as the European Committee for Standardisation (CEN) and the International Organisation for Standardisation (ISO) producing similar standard methods which would be equally relevant.


12.4 Analytical quality control

12.4.1 Background

Analytical Quality Control (AQC) is the term used to describe the procedures adopted to ensure that analytical measurements are of adequate accuracy for their intended purpose. It is worth emphasising that, in any form of monitoring, the aim should not be to seek the ultimate achievable accuracy. The tasks are: (i) to establish sufficient control over measurement errors to allow clear and accurate interpretation; and (ii) to maintain consistency of measurement so that any temporal changes of interest can be discerned.

AQC is the principal practical component of a system of Quality Assurance. Other aspects of Quality Systems (e.g. staff training, instrument maintenance, adequate systems of records) are also important to ensure satisfactory operation of a monitoring programme. For example, it is of little consequence to achieve adequate accuracy, if samples cannot be identified clearly. However, these issues are outside the scope of this section.


12.4.2 Summary of approach to analytical quality control

The following summarises the essential features of Quality Control activities in laboratories undertaking water quality monitoring. The approach is described more fully in the European Standard guidance document "Guide to Analytical Quality Control for Water Analysis" CEN TC230 WG1 TG4, N120.

Laboratories should carry out the following procedures in sequence and obtain satisfactory results before an analytical system is used for routine analysis. The following stages should be observed:

  1. Obtain or derive standards of analytical performance (maximum values for random and systematic error) for the determinants, concentration ranges and sample types of interest. Select an analytical system capable of producing results of the required accuracy for the determinant in question. The analytical method must describe, unambiguously and in sufficient detail, the full analytical procedure.
  2. Estimate the within-laboratory total standard deviation of individual results for a range of sample types or matrices and concentrations representative of the samples and sample types of interest.
  3. Estimate spiking recovery achieved using the chosen analytical system for the sample matrix or matrices of interest.
  4. Establish a fully documented, routine AQC system based on quality control charts, as a continuing check on analytical performance when the system is in routine use. Any problems indicated by the routine control system should be investigated immediately and remedial action taken.
  5. As an independent check on analytical performance, laboratories should participate in appropriate external inter-laboratory quality control schemes involving the distribution of check samples. Any evidence from such participation that analytical errors are larger than the acceptable limits should trigger investigation and remedial action.

It is emphasised that the largest part of AQC effort should be expended on stage 4, above. Participation in inter-laboratory tests is an important supplement to routine within-laboratory quality control, rather than a substitute for it.


12.4.3 Within-laboratory quality control

Routine quality control within a laboratory is based on the use of control charts. The laboratory must analyse a control sample at least once in each batch of analysis. The results of these control analyses are used to plot a control chart which is used to maintain the analytical system in a state of statistical control.

The control sample should be chosen such that it is subject to the same potential sources of error as samples analysed routinely. As a minimum requirement, the control sample should be a solution which contains a known concentration of determinant no greater than the level of interest. Where sample concentrations are greater than the level of interest, then additional control samples should be used to reflect sample concentrations. The type and frequency of use of control materials will depend on the analytical technique and the nature and likely sources of error which may affect results. Normally, between 5% and 20% of all samples analysed should be control samples. All control samples should be subject to the full analytical procedure. The results for all control analyses should be recorded.
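As an illustration of how a control chart is interpreted, the sketch below classifies a single control result against a common Shewhart scheme, with warning limits at ±2 standard deviations and action limits at ±3. The specific limits and function name are assumptions for illustration; the rules actually applied must be those in the laboratory's documented AQC procedures (and the CEN guidance cited above):

```python
def control_status(result, target, sd):
    """Classify a control-sample result against Shewhart-style limits.

    target: known concentration of the control solution.
    sd: within-laboratory total standard deviation, estimated beforehand.
    """
    z = abs(result - target) / sd
    if z > 3:
        return "action"       # out of control: investigate, withhold results
    if z > 2:
        return "warning"      # acceptable, but watch subsequent batches
    return "in control"
```

Plotting successive control results against the same limits gives the control chart itself, and makes drifts or steps in performance visible at a glance.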

Where the limit of detection is critical (e.g. for calculation of contaminant loads), duplicate blank determinations should be made in each routine batch of analyses. The limit of detection should then be re-estimated at 11-batch intervals from these measurements. Reporting limits should be based on the most recent estimate of the limit of detection.
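One common convention (an assumption here, not a requirement of the text) estimates the limit of detection as a multiple, often three, of the within-batch standard deviation of the blank, with that standard deviation obtained from the paired differences of the duplicate blanks. A sketch:

```python
import math

def lod_from_blank_pairs(blank_pairs, factor=3.0):
    """Estimate the limit of detection from duplicate blank determinations.

    blank_pairs: list of (b1, b2) duplicate blank results, one pair per
    batch (e.g. the last 11 batches). The within-batch standard deviation
    is estimated from the paired differences as sqrt(sum(d^2) / (2 m));
    the LoD is then factor * sd. factor=3 is a common convention only.
    """
    m = len(blank_pairs)
    ss = sum((b1 - b2) ** 2 for b1, b2 in blank_pairs)
    sd = math.sqrt(ss / (2 * m))
    return factor * sd
```

Re-running this over the most recent batches at the stated 11-batch interval keeps the reporting limit tied to current, rather than historical, blank performance.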

It is essential that the laboratory has adequately documented procedures which define loss of statistical control and specific actions to be taken when an out of control condition arises. Records of breaches of the control rules need to be maintained and, as a minimum, should include:

  • Information to identify the control sample concerned and, via the batch of analysis, the identity of all associated test sample results.
  • Details of the breach of control rules including a record of the control result and the control limits in force at the time.
  • Action taken to investigate the cause of the out of control condition and any consequent conclusions and remedial measures.
  • Action taken with respect to the associated test sample results.

The results of analyses obtained using a system not in statistical control should not be released, except under exceptional circumstances. Any such results should be identifiable for future examination and audit. The circumstances under which such results may be released should be documented clearly and shall include the specification that the cause of the out of control condition must first be identified and shown not to affect results for the analysis of samples.

The control chart should be reviewed periodically and the control limits updated if necessary. The results of all current quality control analyses should be taken into account in calculations of performance and in updating charts, apart from out of control values for which the cause has been identified.

Unless it is agreed otherwise, the laboratory should adhere to the test protocol for an inter-laboratory exercise. Samples provided in proficiency testing schemes should be treated as far as is possible in the same way as routine samples with respect to storage, registration, analysis and reporting. Routine AQC procedures should be applied. In particular, any replication of analysis carried out as part of an inter-laboratory test should as far as is possible be 'blind'. Individual replicates need to be submitted for analysis independently and without reference to one another. No more than the specified number of determinations should be made.

Summary of approach to laboratory AQC

Laboratories should carry out the following procedures in sequence and obtain satisfactory results before any analytical system is used for routine analysis:

  1. Select an analytical system capable of producing results of the required accuracy for the determinant in question. The analytical method must describe, unambiguously and in sufficient detail, the full analytical procedure.
  2. Estimate the within-laboratory total standard deviation of individual results for a range of sample types or matrices and concentrations representative of the samples and sample types of interest.
  3. Estimate spiking recovery achieved using the chosen analytical system for the sample matrix or matrices of interest.
  4. Establish a fully documented, routine AQC system based on quality control charts, as a continuing check on analytical performance when the system is in routine use. Any problems indicated by the routine control system must be investigated immediately and remedial action taken.


12.4.4 Inter-laboratory quality control

Laboratories should also participate in suitable external inter-laboratory quality control schemes involving the distribution of check samples. A sample check scheme typically entails the organising laboratory distributing samples of different matrices (e.g. fresh and salt water) and determinants (e.g. metals and organic substances) to participating laboratories. Analysis is undertaken by the participating laboratory and the results are returned to the organising laboratory. This provides a continuous check on the accuracy and comparability of analytical results obtained in the participating laboratories, and identifies the determinants for which improved accuracy is required, towards which each laboratory should assign priority within its own analytical quality control work.


12.4.5 National and international quality assurance programmes

Examples of national and international quality assurance programmes exist in some EEA member states, and these could form the basis for assuring at least the quality of chemical data reported to the Agency.

Table 12.2 summarises the national analytical quality control programmes that were reported to be in use in 1992/93 by 12 of the 17 EEA member states (ERM, 1993, cited in Groot and Villars, 1995). It can be seen that most countries reported having some national analytical quality control programme in place.

There may also be a need to establish international quality assurance programmes. Such programmes already exist for marine waters, for example the QUASIMEME programme, which currently supports 90 laboratories in Europe that submit data to international marine monitoring programmes (OSPARCOM, HELCOM, MEDPOL, ICES). Under Article 2 of the Agency Regulation the EEA is required to co-operate with certain organisations, such as the Joint Research Centre (JRC), on certain tasks. The JRC runs a sample check scheme and a reference material production and dissemination programme, AQUACON, and may therefore have an overseeing role in assuring the analytical quality of data submitted to the Agency.


Table 12.2 Summary of analytical quality control measures in some EEA Member States (ERM, 1993 cited in Groot and Villars, 1995)

Country       Analytical Quality Control

Belgium       Yes. Includes the use of recovery efficiency, blank samples and analytical standards.
Denmark       Yes. Internal AQC includes control charts and inter-laboratory comparisons.
France        Yes. Internal AQC, with many laboratories formalising procedures in a Quality Manual.
Germany       Yes. Internal AQC protocol including recovery checks, blank tests and use of different analytical methods for confirmation.
Greece        No. No formal AQC procedures currently established.
Ireland       Yes. Internal AQC protocol including reference standards, spiked samples and extraction efficiency tests.
Italy         Yes. Internal AQC including recovery efficiencies, blank samples and analytical standards.
Luxembourg    Yes.
Netherlands   Yes. Internal AQC protocols including control charts, reference samples for recovery checks, blank samples and inter-laboratory comparisons.
Portugal      Yes. Internal AQC including control charts and reference standards.
Spain         Yes. Internal AQC procedures applied.
UK            Yes. Internal AQC including control charts, reference standards, spiked samples, recovery efficiency tests, etc. Laboratories also participate in inter-laboratory checks and all are externally certified.

