The general understanding of the term "calibration" is far from what applies to the concept in an analytical sense. Leaving aside colloquial connotations, such as calibrating a weapon, the term is generally associated with the adjustment of specific parameters of an object to fixed or desired quantities, and in particular with the adjustment of a specific instrument to perform a correct function. It is, therefore, understood more as a process of instrumental standardization or adjustment. This is reinforced by publicly available nomenclatural sources. For example, in the Cambridge Advanced Learner's Dictionary [1] calibration is defined as "… the process of checking a measuring instrument to see if it is accurate," and in the Vocabulary.com online dictionary as "the act of checking or adjusting (by comparison with a standard) the accuracy of a measuring instrument" [2]. Even in a modern textbook in the field of instrumental analysis, one can read: "In analytical chemistry, calibration is defined as the process of assessment and refinement of the accuracy and precision of a method, and particularly the associated measuring equipment." [3].
The ambiguity of the term "calibration" makes it difficult to understand it properly in a purely analytical sense. To understand the term in this way, one must of course take into account the specificity of chemical analysis.
The analyst aims to obtain the analytical result, i.e. to identify the type (in qualitative analysis) or to determine the quantity (in quantitative analysis) of a selected component (analyte) in the material (sample) assayed. To achieve this goal, the analyst must undertake a series of operations that make up the analytical procedure, the general scheme of which is shown in Figure 1.1.
When starting an analysis, the sample must first be prepared for measurement in such a way that its physical and chemical properties are most suitable for measuring the type or amount of analyte in question. This step comprises processes such as taking the sample from its natural environment and then changing its aggregate state, diluting it, preconcentrating it, separating its components, changing the temperature, or inducing a chemical reaction.
Figure 1.1 Analytical procedure alone (a) and supplemented by analytical calibration (b).
The measurement is generally performed using an instrument that operates on the principle of a chosen measurement method (e.g. atomic absorption spectrometry, potentiometry, etc.). The instrument should respond to the presence of the analyte studied in the form of measurement signals. From a calibration point of view, the most relevant signal is the so-called analytical signal, i.e. the signal corresponding to the presence of the analyte in the sample.
An analytical procedure carried out in a defined manner by a specific measurement method forms an analytical method.
The basic analytical problem is that the analytical signal is not a direct measure of the type and amount of analyte in the sample, but only information indicating that a certain component is present in the sample in a certain amount. To perform a complete analysis, the analytical signal must be transformed into the analytical result. This transformation is the role of analytical calibration. As seen in Figure 1.1b, the analytical calibration process is an integral part of the analytical procedure; without analytical calibration, qualitative and quantitative analysis cannot be performed. Realizing this aspect allows one to look at the subject of calibration as a fundamental analytical issue.
However, there is still the question of what the process of transforming an analytical signal to an analytical result consists of, i.e. how analytical calibration should be defined. In this regard, there is also no unified approach, so it is best to rely on official recommendations.
The process of analytical calibration is largely concerned with the making of measurements and the interpretation of measurement data and therefore falls within the scope of metrology. In the Joint Committee for Guides in Metrology (JCGM) document on basic and general terms in metrology, calibration is defined as an "… operation that, under specified conditions, in a first step, establishes a relation between the quantity values with measurement uncertainties provided by measurement standards and corresponding indications with associated measurement uncertainties and, in a second step, uses this information to establish a relation for obtaining a measurement result from an indication" [4]. At the same time, the document makes it clear that "calibration should not be confused with adjustment of a measuring system …".
The metrological definition, although it allows for a deeper understanding of the concept of calibration, is still rather general, because it is inherently applicable to different measurement systems and different types of results obtained. The concept of calibration in the analytical sense is more closely approximated by publications issued by the International Union of Pure and Applied Chemistry (IUPAC). In the paper [5], the IUPAC definition is aligned with the JCGM definition in that it defines analytical calibration as "… the set of operations which establish, under specified conditions, the relationship between values indicated by the analytical instrument and the corresponding known values of an analyte," and in a subsequent IUPAC publication [6] we find an express reference of analytical calibration to both quantitative and qualitative analysis: "Calibration in analytical chemistry is the operation that determines the functional relationship between measured values (signal intensities at certain signal positions) and analytical quantities characterizing types of analytes and their amount (content, concentration)."
Such a purely theoretical approach is too general, even abstract, and unrelated to analytical practice. In particular, it does not provide guidance on how the functional relationship (calibration model) should be formulated in different analytical situations and how it relates to the different types of methods used in qualitative and quantitative analysis. Nor does it say anything about the relative nature of the calibration process that the term "measurement standard" gives to the concept in metrological terms.
To extend the definition of analytical calibration, the author proposes to introduce the concept of three functions that relate the signal to the analytical result: the true function, the real function, and the model function [7]. This approach is illustrated in Figure 1.2.
If a sample that an analyst takes for qualitative or quantitative analysis contains a component (analyte) of interest, then before any action is taken with the sample, the type of analyte and its quantity in the sample can be referred to as the true value (type or quantity), x_true, of the analyte. If it were possible to measure the analytical signal for that analyte at that moment, then the relationship between the resulting signal and its true type or quantity, Y_true = T(x_true), could be called the true function.
However, the determination of the true function and the true value of the analyte is not possible in practice because it requires the analyst's intervention in the form of preparing the sample for measurement and performing the measurement. The initiation of even the simplest and shortest analytical steps results in a change of the true analyte concentration in the sample that continues until the analytical signal is measured. Thus, the concepts of true function and true analyte value are essentially unrealistic and impossible to verify experimentally or mathematically.
Figure 1.2 Concept of analytical calibration based on the terms of true, Y = T(x), real, Y = F(x), and model, Y = G(x), functions (virtual analytical steps and terms are denoted by dotted lines; for details see text).
When the sample is prepared for analysis, the type or amount of analyte in the sample to be analyzed takes on a real value, x_0. The relationship between the analytical signal and the type or amount of analyte is described at this point by the real function, Y = F(x), which takes the value Y_0 for the value x_0:

Y_0 = F(x_0)
Although the value of Y_0 is measurable, the exact form of the real function is unknown, because it depends on a number of effects and processes that led to the current state of this relationship during the preparation of the sample for measurement. Consequently, the determination of the real result x_0 by means of the real function is impossible.
This situation forces the formulation of an additional auxiliary model function, Y = G(x). The role of this function is to...
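In analytical practice, the model function G(x) commonly takes the form of an empirical calibration curve fitted to standards of known analyte content and then inverted to transform a measured signal Y_0 into a result x_0. A minimal sketch in Python, assuming the simplest common case of a linear model; the standard concentrations and signals are hypothetical numbers, not data from the text:

```python
# Hypothetical linear calibration: fit a model function G(x) = a + b*x to
# standards of known concentration, then invert it to transform a measured
# signal Y0 into an analytical result x0. All numbers are illustrative.

# Known analyte concentrations of the calibration standards (e.g. mg/L)
x_std = [0.0, 1.0, 2.0, 4.0, 8.0]
# Measured analytical signals for those standards (e.g. absorbance)
y_std = [0.01, 0.12, 0.23, 0.45, 0.89]

n = len(x_std)
mean_x = sum(x_std) / n
mean_y = sum(y_std) / n

# Ordinary least-squares slope b and intercept a of G(x) = a + b*x
b = sum((x - mean_x) * (y - mean_y) for x, y in zip(x_std, y_std)) \
    / sum((x - mean_x) ** 2 for x in x_std)
a = mean_y - b * mean_x

def g_inverse(y0):
    """Transform a measured signal Y0 into the result x0 via the model."""
    return (y0 - a) / b

x0 = g_inverse(0.34)  # signal measured on the prepared sample
print(f"G(x) = {a:.4f} + {b:.4f}*x  ->  x0 = {x0:.3f}")
```

The inversion step is the signal-to-result transformation that the text attributes to analytical calibration; more elaborate models (nonlinear, multivariate) follow the same fit-then-invert logic.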