Selecting the proper instrument for a particular type of measurand requires knowledge of the performance characteristics of instruments.
The performance characteristics of an instrument are mainly divided into two categories:
1. Static characteristics
2. Dynamic characteristics
Static characteristics:
The set of criteria defined for instruments that are used to measure quantities which vary slowly with time or are mostly constant, i.e., do not vary with time, is called the static characteristics. The various static characteristics are accuracy, precision, resolution, error, sensitivity, threshold, reproducibility, zero drift, stability, and linearity.
Accuracy
It is the degree of closeness with which the instrument reading approaches the true value of the quantity to be measured. It denotes the extent to which we approach the actual value of the quantity, and thus indicates the ability of the instrument to indicate the true value of the measured quantity.
Precision
It is the measure of consistency or repeatability of measurements. It denotes the closeness with which individual measurements are distributed about the average of a number of measured values. This confirms that a high degree of precision does not guarantee accuracy; it is accurate calibration that makes accurate measurement possible.
The precision is composed of two characteristics:
1. Conformity
2. Number of significant figures
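To see why high precision does not guarantee accuracy, here is a minimal sketch with hypothetical readings; quantifying the spread by the standard deviation is one common convention, not something the text above prescribes:

```python
# Precision vs. accuracy: spread of repeated readings about their
# average, contrasted with closeness of that average to the true value.
import statistics

true_value = 100.0                               # known reference input
readings = [101.1, 101.2, 101.0, 101.1, 101.2]   # hypothetical repeated readings

mean_reading = statistics.mean(readings)
spread = statistics.stdev(readings)              # small spread -> high precision

print(f"average reading : {mean_reading:.2f}")
print(f"spread (stdev)  : {spread:.3f}")                       # precise...
print(f"offset from true: {mean_reading - true_value:+.2f}")   # ...but not accurate
```

The readings agree closely with one another (high precision), yet their average sits about 1.1 units away from the true value (poor accuracy), which is exactly why precision alone does not guarantee accuracy.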
Error
The most important static characteristic of an instrument is its accuracy, which is generally expressed in terms of the error, called the static error.
The algebraic difference between the indicated value and the true value of the quantity to be measured is called the error.
Mathematically it can be expressed as,
e = Am - At
where
e = error
Am = measured value of the quantity
At = true value of the quantity
In this expression, the error denoted as 'e' is also called the absolute error.
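As a minimal illustration of this definition, the sketch below computes the static error for hypothetical measured and true values; the percentage form shown is a common extension, not given in the text above:

```python
# Static (absolute) error: e = Am - At, per the definition above.
def static_error(measured: float, true_value: float) -> float:
    """Algebraic difference between the indicated and true values."""
    return measured - true_value

Am, At = 99.5, 100.0                 # hypothetical measured and true values
e = static_error(Am, At)
print(f"absolute error e = {e:+.2f}")                 # -0.50
print(f"as % of true value: {100 * e / At:+.2f}%")    # -0.50%
```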
Sensitivity
The sensitivity denotes the smallest change in the measured variable to which the instrument responds. It is defined as the ratio of the change in the output of an instrument to the change in the value of the quantity to be measured.
Mathematically it is expressed as,
Sensitivity = infinitesimal change in output / infinitesimal change in input
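In practice the infinitesimal ratio is approximated from finite changes between calibration points. A minimal sketch, with hypothetical thermocouple numbers:

```python
# Sensitivity approximated as the ratio of a finite output change to
# the corresponding finite input change between two calibration points.
def sensitivity(d_output: float, d_input: float) -> float:
    """Sensitivity = change in output / change in input."""
    return d_output / d_input

# Hypothetical: 0.40 mV change in output for a 10 degC change in input
print(sensitivity(d_output=0.40, d_input=10.0), "mV/degC")   # 0.04 mV/degC
```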
Resolution
It is the smallest increment of the quantity being measured which can be detected with certainty by an instrument. Thus, the resolution means the smallest measurable input change.
If a non-zero input quantity is slowly increased, the output reading will not increase until some minimum change in the input takes place. This minimum change, which causes a change in the output, is called the resolution. The resolution of an instrument is also referred to as the discrimination of the instrument. The resolution can affect the accuracy of the measurement.
Threshold
If the input quantity is slowly varied from zero onwards, the output does not change until some minimum value of the input is exceeded. This minimum value of the input is called the threshold.
Thus the resolution is the smallest measurable input change, while the threshold is the smallest measurable input, as the sketch below illustrates.
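The distinction can be shown by simulating an idealized instrument; the threshold and resolution values used here are purely hypothetical:

```python
# An idealized instrument that indicates nothing until the input
# exceeds the threshold, and whose reading then changes only in
# steps equal to the resolution.
THRESHOLD = 0.05    # smallest measurable input (hypothetical)
RESOLUTION = 0.01   # smallest measurable input change (hypothetical)

def indicated(x: float) -> float:
    """Reading of the idealized instrument for input x."""
    if x < THRESHOLD:
        return 0.0                                 # below threshold: no output
    return round(x / RESOLUTION) * RESOLUTION      # quantized to the resolution

for x in (0.02, 0.04, 0.06, 0.063, 0.071):
    print(f"input {x:.3f} -> reading {indicated(x):.2f}")
```

Inputs below 0.05 produce no reading at all (threshold), while raising the input from 0.060 to 0.063 leaves the reading unchanged because that change is smaller than the resolution.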
Linearity
An instrument is required to have the property of linearity, that is, the output should vary linearly with the input. The linearity is defined as the ability to reproduce the input characteristics symmetrically and linearly. Graphically, such a relationship between input and output is represented by a straight line.
[Figure: Linearity]
The graph of output against input is called the calibration curve. The linearity property indicates the straight-line nature of the calibration curve. Linearity is quantified as the maximum deviation of the actual calibration curve (output) from the idealized straight line, expressed as a percentage of the full-scale reading or as a percentage of the actual reading.
It is desirable to have an instrument that is as linear as possible, as accuracy and linearity are closely related to each other.
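The following sketch quantifies linearity for a hypothetical calibration curve; taking the straight line through the end points as the idealized line is one common convention, which the text above does not fix:

```python
# Non-linearity as the maximum deviation of the actual calibration
# curve from an idealized straight line, as a percentage of full scale.
inputs  = [0.0, 2.0, 4.0, 6.0, 8.0, 10.0]
outputs = [0.0, 2.1, 4.3, 6.2, 8.1, 10.0]   # hypothetical calibration data

# Idealized straight line through the end points of the curve
slope = (outputs[-1] - outputs[0]) / (inputs[-1] - inputs[0])
ideal = [outputs[0] + slope * (x - inputs[0]) for x in inputs]

max_dev = max(abs(a - b) for a, b in zip(outputs, ideal))
full_scale = outputs[-1]
print(f"max deviation: {max_dev:.2f} ({100 * max_dev / full_scale:.1f}% of full scale)")
```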
Zero drift
The drift is a gradual shift of the instrument indication over an extended period during which the value of the input variable does not change.
The zero drift is defined as the deviation in the instrument output from its zero value, over time, when the variable to be measured is constant. The whole instrument calibration may gradually shift by the same amount.
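A minimal sketch of how zero drift might be estimated, assuming hypothetical readings taken with the input held constant at zero:

```python
# Zero drift: shift of the output from its zero value over time while
# the measured variable is held constant (at zero). Data hypothetical.
hours    = [0, 1, 2, 3, 4]
readings = [0.00, 0.02, 0.03, 0.05, 0.06]   # output with zero input

drift = readings[-1] - readings[0]          # total shift from the zero value
print(f"zero drift after {hours[-1]} h: {drift:+.2f} units")
print(f"approximate drift rate: {drift / hours[-1]:+.3f} units/h")
```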