Monitoring accuracy and precision – Improvements by introducing robust and resistant statistics

New advances in method validation and measurement uncertainty aimed at improving the quality of chemical data

Analytical and Bioanalytical Chemistry, 2004

The implementation of quality systems in analytical laboratories has now, in general, been achieved. While this requirement significantly modified the way that the laboratories were run, it has also improved the quality of the results. The key idea is to use analytical procedures which produce results that fulfil the users’ needs and actually help when making decisions. This paper presents the implications of quality systems on the conception and development of an analytical procedure. It introduces the concept of the lifecycle of a method as a model that can be used to organize the selection, development, validation and routine application of a method. It underlines the importance of method validation, and presents a recent approach based on the accuracy profile to illustrate how validation must be fully integrated into the basic design of the method. Thanks to the β-expectation tolerance interval introduced by Mee (Technometrics (1984) 26(3):251–253), it is possible to unambiguously demonstrate the fitness for purpose of a new method. Remembering that it is also a requirement for accredited laboratories to express the measurement uncertainty, the authors show that uncertainty can be easily related to the trueness and precision of the data collected when building the method accuracy profile.
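The β-expectation tolerance interval cited above can be sketched numerically. The following is a minimal illustration, not the paper's full accuracy-profile procedure: it computes the common two-sided form, mean ± t((1+β)/2, n−1) · s · √(1 + 1/n), for a set of hypothetical validation results (the data values and the 95–105 % acceptance limits are invented for illustration).

```python
import math
from statistics import mean, stdev
from scipy.stats import t

def beta_expectation_interval(results, beta=0.95):
    """Two-sided beta-expectation tolerance interval:
    mean +/- t_{(1+beta)/2, n-1} * s * sqrt(1 + 1/n)."""
    n = len(results)
    m = mean(results)
    s = stdev(results)
    k = t.ppf((1 + beta) / 2, n - 1) * math.sqrt(1 + 1 / n)
    return m - k * s, m + k * s

# Hypothetical validation results (% recovery) at one concentration level
data = [98.2, 101.5, 99.8, 100.4, 97.9, 100.9]
lo, hi = beta_expectation_interval(data)
```

Fitness for purpose is then judged by checking whether the interval falls inside the user's acceptance limits (e.g. 95–105 % recovery) at each concentration level of the accuracy profile.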

Validation of analytical methods and laboratory procedures for chemical measurements

Arhiv za higijenu rada i toksikologiju, 1998

Method validation is a key element in the establishment of reference methods and in the assessment of a laboratory's competence in producing reliable analytical data. Hence, the scope of the term "method validation" is wide, especially if one bears in mind the role of Quality Assurance/Quality Control (QA/QC). The paper puts validation in the context of the process generating chemical information, introduces the basic performance parameters included in validation processes, and evaluates current approaches to the problem. Two cases are presented in more detail: the development of a European standard for chlorophenols and its validation by a full-scale collaborative trial, and the intralaboratory validation of a method for ethylenethiourea using alternative analytical techniques.

Quality Control Assessment for Chemical Analysis

International Journal of Science and Research (IJSR), 2016

The current research focused on the importance of quality control in chemical analysis through instrument calibration, using standard mixture solutions of different organochlorine pesticides. To this end, statistical quality control calculations were performed to validate the gas chromatography–mass spectrometry (GC-MS) method and establish its working range. The quality control assessment results confirmed the instrument method's performance in terms of accuracy, sensitivity, and selectivity.

Trends in quality in the analytical laboratory. I. Traceability and measurement uncertainty of analytical results

TrAC Trends in Analytical Chemistry, 2004

Credibility of analytical data has never caught the public's eye more than today. The key principle for quality and reliability of results is comparability between laboratories and on a wider, international basis. In order to be comparable, analytical results must be reported with a statement of measurement uncertainty (MU) and they must be traceable to common primary references. This work focuses on traceability and uncertainty of results. We discuss different approaches to establishing traceability and evaluating MU. We place both concepts in the broader context of analytical method validation and quality assurance. We give up-to-date information in the framework of new, more exacting European and international standards, such as those from Eurachem/CITAC, IUPAC and ISO.

Measurement Performance Assessment of Analytical Chemistry Analysis Methods using Sample Exchange Data

International Journal of Chemistry, 2011

Measurement error modeling is crucial to any assay method. Realistic error models prioritize efforts to reduce key error components and provide a way to estimate total ("random" and "systematic") measurement error variances. This paper uses multi-laboratory data to estimate random error and systematic error variances for seven analytical chemistry destructive assay methods for five analytes (Gallium, Iron, Silicon, Plutonium, and Uranium). Because these variance estimates are based on multiple-component error models, strategies are described for choosing and then fitting error models that allow for lab-to-lab variation.
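A minimal version of the variance decomposition described above, assuming a balanced one-way random-effects layout (one group of replicate results per laboratory), can be sketched as follows; the paper's multiple-component models are richer, and the lab names and data here are invented.

```python
import statistics

def variance_components(labs):
    """Estimate within-lab ("random") and between-lab ("systematic")
    variance components from balanced sample-exchange data.
    `labs` is a list of equal-length lists, one per laboratory."""
    k = len(labs)                 # number of laboratories
    n = len(labs[0])              # replicates per laboratory (balanced design)
    lab_means = [statistics.mean(g) for g in labs]
    grand = statistics.mean(lab_means)
    # Within-lab mean square: pooled replicate variance
    msw = statistics.mean([statistics.variance(g) for g in labs])
    # Between-lab mean square
    msb = n * sum((m - grand) ** 2 for m in lab_means) / (k - 1)
    s2_random = msw
    s2_systematic = max((msb - msw) / n, 0.0)  # truncate negative estimates at 0
    return s2_random, s2_systematic

# Hypothetical results for one analyte from three laboratories
s2_r, s2_s = variance_components([[10.0, 10.2, 9.8],
                                  [11.0, 11.1, 10.9],
                                  [10.5, 10.4, 10.6]])
```

The truncation at zero is the usual method-of-moments fix when sampling noise makes the between-lab mean square fall below the within-lab one.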

Measurement uncertainty in analytical methods in which trueness is assessed from recovery assays

Analytica Chimica Acta, 2001

We propose a new procedure for estimating the uncertainty in quantitative routine analysis. This procedure uses the information generated when the trueness of the analytical method is assessed from recovery assays. In this paper, we assess trueness by estimating proportional bias (in terms of recovery) and constant bias separately. The advantage of the procedure is that little extra work needs to be done to estimate the measurement uncertainty associated with routine samples. This uncertainty is considered to be correct whenever the samples used in the recovery assays are representative of the future routine samples (in terms of matrix and analyte concentration). Moreover, these samples should be analysed by varying all the factors that can affect the analytical method. If they are analysed in this fashion, the precision estimates generated in the recovery assays take into account the variability of the routine samples and also all the sources of variability of the analytical method. Other terms related to sample heterogeneity, sample pretreatments or factors not representatively varied in the recovery assays should only be included subsequently when necessary. The ideas presented are applied to calculate the uncertainty of results obtained when analysing sulphides in wine by HS-SPME-GC.
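The proportional-bias part of this scheme can be sketched numerically. This simplified illustration (hypothetical spiking data; the paper's treatment of constant bias and of factor variation is omitted) estimates mean recovery and its standard uncertainty, then propagates both into a recovery-corrected result by combining the relative uncertainties in quadrature:

```python
import math
import statistics

def recovery_uncertainty(found, added):
    """Mean recovery R and its standard uncertainty u(R) from spiking assays."""
    recov = [f / a for f, a in zip(found, added)]
    R = statistics.mean(recov)
    u_R = statistics.stdev(recov) / math.sqrt(len(recov))  # std. error of mean R
    return R, u_R

def corrected_result(x, s_x, R, u_R):
    """Recovery-corrected result c = x / R and its combined standard uncertainty,
    combining the precision and recovery terms in relative (quadrature) form."""
    c = x / R
    u_c = c * math.sqrt((s_x / x) ** 2 + (u_R / R) ** 2)
    return c, u_c

# Hypothetical spiking assays: amounts found vs. amounts added
R, u_R = recovery_uncertainty(found=[9.5, 9.8, 9.6], added=[10.0, 10.0, 10.0])
c, u_c = corrected_result(x=4.82, s_x=0.10, R=R, u_R=u_R)
```

Note that the combined uncertainty of the corrected result is necessarily larger than the precision term alone, since correcting for recovery imports the uncertainty of the recovery estimate itself.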

Design of experiments for the determination of the detection limit in chemical analysis

Analytica Chimica Acta, 1995

The uncertainty of detection-limit values in chemical analysis is described, considering the acceptable risks of false analyte detection and false non-detection, the number of analytical results, and other parameters of the experimental design. Calculations are made of the minimum number of certified reference materials or standard solutions, and of the number of replicates, needed to design experiments for determining the detection limit from calibration data.
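A common calibration-based estimate consistent with the idea above can be sketched as follows. This is the simple LOD = k · s(y/x) / slope form with k ≈ 3.3 (roughly 5 % risks of false detection and false non-detection); the paper's design-of-experiments treatment of these risks is more rigorous, and the calibration data here are invented.

```python
import math

def detection_limit(conc, signal, k=3.3):
    """Detection limit from an ordinary least-squares calibration line:
    LOD = k * s_{y/x} / slope, with s_{y/x} the residual std. deviation."""
    n = len(conc)
    mx = sum(conc) / n
    my = sum(signal) / n
    sxx = sum((x - mx) ** 2 for x in conc)
    sxy = sum((x - mx) * (y - my) for x, y in zip(conc, signal))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (intercept + slope * x) for x, y in zip(conc, signal)]
    s_yx = math.sqrt(sum(r * r for r in resid) / (n - 2))
    return k * s_yx / slope

# Hypothetical calibration: five standard solutions, near-linear response
lod = detection_limit(conc=[0, 1, 2, 3, 4],
                      signal=[0.1, 2.0, 4.1, 5.9, 8.0])
```

The number of standards and replicates directly controls s(y/x) and hence the reliability of this estimate, which is the design question the paper addresses.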

Evaluation of the analytical method performance for incurred samples

Analytica Chimica Acta, 2003

A methodology for evaluating the performance of an analytical method for incurred samples is presented. Because it is based on intra-laboratory information, it is suitable for analytical fields that lack reference materials with incurred analytes, and it can be used to evaluate the analytical steps prior to the analytical portion, which are usually excluded from proficiency tests and from the certification of reference materials. The methodology can be based on tests performed on routine samples, allowing information to be collected on the most relevant analyte/matrix combinations. This approach is therefore particularly useful for analytical fields that involve a large number of analyte/matrix combinations, which are difficult to cover even with frequent participation in expensive proficiency tests.

Chemometric Protocol to Validate an Analytical Method in the Presence of Corrigible Constant and Proportional Systematic Errors

Journal of AOAC INTERNATIONAL, 1997

A statistical methodology to verify the trueness of an analytical method in the presence of corrigible systematic errors is presented. This protocol enables detection of constant and proportional components of error. By using the data set obtained in the Youden calibration with different sample test portions, the constant component of the error (Youden blank) can be determined. An analysis of covariance was applied to 3 calibration curves established with standard solutions and with standard additions to 2 different sample test portions. The slopes were compared, and the presence of any matrix-analyte interaction was detected. A method for removing the numerical components of systematic errors is proposed: a calculation procedure to obtain a correct analytical result and a statistical test to verify the correctness of analyte contents obtained from different calibrations. For demonstration purposes, the protocol was applied to spectrofluorometric determination of oxalates in spinach...
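The Youden-calibration step of this protocol can be sketched as a simple regression of the analytical result on the sample test-portion size: the intercept (Youden blank) estimates the constant error component, while a slope that differs between standard and standard-addition calibrations signals proportional (matrix) error. This is a minimal illustration with invented data, not the paper's full ANCOVA protocol:

```python
import statistics

def youden_calibration(portions, results):
    """Fit result = intercept + slope * portion by ordinary least squares.
    The intercept is the Youden blank (constant systematic error estimate)."""
    mx = statistics.mean(portions)
    my = statistics.mean(results)
    slope = (sum((x - mx) * (y - my) for x, y in zip(portions, results))
             / sum((x - mx) ** 2 for x in portions))
    intercept = my - slope * mx  # Youden blank
    return slope, intercept

# Hypothetical data: four test-portion sizes, constant error of 0.3 units
slope, blank = youden_calibration(portions=[0.5, 1.0, 1.5, 2.0],
                                  results=[1.3, 2.3, 3.3, 4.3])
```

Subtracting the Youden blank before applying the recovery correction is what makes the constant and proportional error components separately corrigible.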

Development of an Automated High-Throughput Analytical Characterization System for Support of a Central Sample Collection Resource

Journal of the Association for Laboratory Automation, 2003

A high-throughput analytical characterization system was developed for quality control support of a central sample collection resource. This system uses liquid chromatography-mass spectrometry with data automation applications developed in-house. Continuous operation of the analytical instrumentation is achieved by fully automating sample submission and report processing. Comprehensive analytical information characteristic of quality, chemical, and physical properties (e.g. relative purity, detection sensitivity, LogD) is automatically transferred to an on-line database. The application of this database for detailed quality assessment of a small sample library (ca. 24,000 compounds) is demonstrated.