University of Mines and Technology, Tarkwa
Marking Scheme for Class Assignment
Dynamic characteristics are those characteristics of measuring instruments that are used for comparing the performance of instruments when measuring a quantity, parameter or condition that rapidly varies or changes with time.
5. Briefly define and explain all the static characteristics of measuring instruments.
I. Accuracy: describes how close a measurement of a quantity, parameter or condition of a
process variable approaches the true value of the process variable.
II. Static Error: the difference between the measured value and the true value of the
process variable (under static conditions).
III. Precision: the ability of an instrument to reproduce a set of readings within a given
accuracy of a process variable.
IV. Drift: the change in the indicated reading of an instrument over time when the value of
the measured quantity remains constant.
V. Sensitivity: the ratio of the change in the output (response) of an instrument to the
change in its input (illustrated in the sketch after this list).
VI. Dead Zone: the largest range of values of a measured variable to which the instrument
does not respond.
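As a minimal illustration of the sensitivity definition in V, the short Python sketch below computes static sensitivity as the slope of a linear calibration. The sensor data, names and units are assumed for illustration only and are not part of the assignment.

# Minimal sketch (assumed data): static sensitivity of a hypothetical
# pressure sensor, estimated as the slope of its linear calibration.
inputs  = [0.0, 5.0, 10.0, 15.0, 20.0]   # assumed input pressures, bar
outputs = [0.0, 2.5, 5.0, 7.5, 10.0]     # assumed output voltages, mV

# Sensitivity = change in output / change in input (calibration slope),
# here taken between the first and last calibration points.
sensitivity = (outputs[-1] - outputs[0]) / (inputs[-1] - inputs[0])
print(f"Sensitivity = {sensitivity} mV/bar")  # prints 0.5 mV/bar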
6. How is the accuracy of an instrument usually defined? - The accuracy of an instrument is a
measure of how close the output reading of the instrument is to the correct value.
What is the difference between accuracy and precision?
Accuracy is how close a measurement of a quantity is to the true value, while precision is the
ability of an instrument to reproduce a set of readings within a given accuracy of a process
variable. High precision does not imply high accuracy: a high-precision instrument may have
low accuracy. Low-accuracy measurements from a high-precision instrument are normally
caused by a bias in the measurements, which is removable by recalibration.
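As a hedged illustration of this distinction, the Python sketch below uses assumed repeated readings of a known true value: the standard deviation of the readings measures precision, the mean offset (bias) measures accuracy, and subtracting the bias mimics recalibration. All numbers are hypothetical.

import statistics

# Assumed example: ten repeated readings of a true value of 100.0 from a
# high-precision instrument carrying a constant bias of about +2.0.
true_value = 100.0
readings = [102.1, 101.9, 102.0, 102.2, 101.8,
            102.0, 102.1, 101.9, 102.0, 102.0]

bias = statistics.mean(readings) - true_value   # systematic error (accuracy)
spread = statistics.stdev(readings)             # repeatability (precision)
print(f"Bias   = {bias:+.2f}  -> low accuracy")
print(f"Spread = {spread:.2f}  -> high precision")

# Recalibration removes the known bias; the spread (precision) is unchanged.
recalibrated = [r - bias for r in readings]
print(f"Bias after recalibration = {statistics.mean(recalibrated) - true_value:+.2f}")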
7. A manganin-wire pressure sensor has a measurement range of 0-20,000 bar and a quoted
inaccuracy of ±1% of full-scale deflection. What is the maximum measurement error when
the instrument is reading a pressure of 15,000 bar?
Solution
The maximum error expected in any measurement reading is 1.0% of the full-scale reading,
which is 20,000 bar for this particular instrument. Hence, the maximum likely error is
1.0% × 20,000 bar = 200 bar. The maximum measurement error is a constant value related to
the full-scale reading of the instrument, irrespective of the magnitude of the quantity that the
instrument is actually measuring. In this case, as worked out above, the magnitude of the error
is 200 bar. Thus,
when measuring a pressure of 15,000 bar, the maximum possible error of 200 bar is 1.33% of
the measurement value.
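The same arithmetic can be checked with the short Python sketch below; the variable names are assumed for illustration only.

# Sketch of the Question 7 arithmetic (names assumed for illustration).
full_scale = 20_000.0   # bar, full-scale deflection
inaccuracy = 0.01       # quoted inaccuracy: ±1% of full-scale deflection
reading    = 15_000.0   # bar, the pressure actually being measured

max_error = inaccuracy * full_scale          # constant across the whole range
percent_of_reading = 100.0 * max_error / reading

print(f"Maximum error         = ±{max_error:.0f} bar")        # ±200 bar
print(f"As a % of the reading = ±{percent_of_reading:.2f}%")  # ±1.33%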