Calibration

Industrial Thermometer calibration

An industrial thermometer cannot be adjusted by calibration; all we can do is verify that its reading is reasonably correct. This applies to mercury-in-glass, strip, and bimetallic thermometers.

The simplest way to check a thermometer is to read it under two known conditions: the temperature at which ice melts and the boiling point of water. This method, however, is not traceable.

To overcome this problem, calibrators are used: the calibrator heats the sensor, and the sensor's reading is compared with that of the calibrator bath. In this way thermocouples, thermistors, and RTDs can be calibrated accurately.

A thermocouple can also be verified by simulating the electrical signal produced by the probe and comparing the expected reading against the indicated one.
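
To illustrate, the millivolt signal to inject for a given test temperature can be taken from a thermocouple reference table. The Python sketch below interpolates a few approximate Type K (ITS-90, 0°C cold junction) table values; the table and function names are illustrative, not part of any standard tool:

```python
# Sketch: millivolt signal to inject when simulating a Type K thermocouple.
# Reference points (deg C -> mV, cold junction at 0 deg C); values are
# approximate ITS-90 Type K table entries.
TYPE_K_MV = [(0, 0.000), (100, 4.096), (200, 8.138), (300, 12.209), (400, 16.397)]

def simulated_mv(temp_c):
    """Linearly interpolate the reference table to get the mV for temp_c."""
    for (t0, mv0), (t1, mv1) in zip(TYPE_K_MV, TYPE_K_MV[1:]):
        if t0 <= temp_c <= t1:
            return mv0 + (mv1 - mv0) * (temp_c - t0) / (t1 - t0)
    raise ValueError("temperature outside table range")

print(simulated_mv(100))  # table value: 4.096 mV
```

Injecting this voltage at the transmitter input and comparing the indicated temperature against the target completes the check.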

Temperature standards:

The basic standards for all temperature calibration are:

  • The ice point: ice melting in distilled water at a standard pressure of 101 325 Pa. This is 0°C or 32°F.
  • The boiling point of distilled water at a standard pressure of 101 325 Pa. This is 100°C or 212°F.

Other fixed points outside this range have been internationally agreed; together they form the International Practical Temperature Scale (IPTS). The most common industrial standard thermometer is the platinum PT100 (100 Ω at 0°C).
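
For temperatures at and above 0°C, the PT100's resistance follows the Callendar-Van Dusen equation R(t) = R0(1 + At + Bt²) with the standard IEC 60751 coefficients. A minimal sketch:

```python
# IEC 60751 Callendar-Van Dusen coefficients for a standard PT100
# (valid for 0 to 850 deg C).
R0 = 100.0        # resistance in ohms at 0 deg C
A = 3.9083e-3
B = -5.775e-7

def pt100_resistance(temp_c):
    """Resistance (ohms) of a PT100 at temp_c, for temp_c >= 0 deg C."""
    return R0 * (1 + A * temp_c + B * temp_c ** 2)

print(round(pt100_resistance(0), 2))    # 100.0 ohms at the ice point
print(round(pt100_resistance(100), 2))  # about 138.51 ohms at 100 deg C
```

Below 0°C a third coefficient (C) enters the equation; the two-coefficient form above is sufficient for the ranges discussed here.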

Calibration:

As mentioned above, a calibrator bath is used for temperature calibration. Its electrical heater uses a PT100 sensor to control the heat.


The thermometer is fitted into a temperature bath (usually sand-filled or a solid block). The temperature of the bath is shown on a digital readout.

Calibration procedure for sand-filled temperature bath:

The standard temperature is set using a semi-standard mercury-in-glass thermometer. An ice/water mixture is used as the 0°C standard.

  • Place the thermometer under test in the ice/water mixture. It should read zero on the scale. If the thermometer is of the filled-system/Bourdon-tube type, adjust the pointer to read zero.
  • Place the thermometer under test in the temperature bath and adjust the set temperature to give the maximum indication on the thermometer dial. If the thermometer is of the filled-system/Bourdon-tube type, adjust the linkage to the maximum indication point.
  • Non-adjustable thermometers are acceptable if they are within ±2°C (±4°F).
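
The pass/fail decision in the last step can be written as a simple tolerance check. A sketch, using the ±2°C limit quoted above (the function name is illustrative):

```python
TOLERANCE_C = 2.0   # acceptance limit for non-adjustable thermometers

def passes_calibration(indicated_c, reference_c, tolerance=TOLERANCE_C):
    """True if the thermometer under test reads within tolerance of the reference."""
    return abs(indicated_c - reference_c) <= tolerance

print(passes_calibration(101.5, 100.0))  # True: 1.5 deg C error is within +/-2
print(passes_calibration(103.0, 100.0))  # False: 3 deg C error is out of tolerance
```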

Calibration of electronic temperature transmitter:

Electronic transmitters are used with most types of thermocouples (T/Cs) and RTDs. Calibration must follow the manufacturer's manual, but the basic procedure and the test equipment are the same for all transmitters.

For two-wire (thermocouple) type calibration:

A digital voltmeter is connected across the test position. The zero and span are adjusted to give a 4-20mA output signal corresponding to the T/C sensor range.

At the lower range value (LRV) the transmitter output should read 4mA, and at the upper range value (URV) it should read 20mA. Adjust the zero or span to correct any deviation.

For three-wire (RTD) type calibration:

The input resistance corresponding to the test temperature is supplied by a decade box (a standard variable resistor). The 4-20mA output is measured using either the test position or a standard resistor.

The zero is adjusted to give 4mA at the minimum temperature set by the decade box, and the span is adjusted to give 20mA at the maximum temperature set by the decade box.

Modern calibrators produce a resistance output so a decade box is not required.
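Putting the pieces together: the resistance to dial on the decade box for a test temperature, and the output current the transmitter should then show. A sketch with an illustrative 0-200°C range; the PT100 coefficients are the standard IEC 60751 values:

```python
# IEC 60751 PT100 coefficients (valid for temp_c >= 0 deg C)
R0, A, B = 100.0, 3.9083e-3, -5.775e-7

def decade_box_setting(temp_c):
    """Resistance (ohms) to dial on the decade box to simulate a PT100 at temp_c."""
    return R0 * (1 + A * temp_c + B * temp_c ** 2)

def expected_output_ma(temp_c, lrv_c=0.0, urv_c=200.0):
    """Transmitter output expected at temp_c on an illustrative 0-200 deg C range."""
    return 4.0 + 16.0 * (temp_c - lrv_c) / (urv_c - lrv_c)

t = 200.0
print(round(decade_box_setting(t), 2))  # ohms to dial in for the URV check
print(expected_output_ma(t))            # the output should then read 20.0 mA
```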

Sivaranjith

Instrumentation Engineer
