How to calibrate a sensor: methods of calibration and how often sensors need it
Sensor calibration is the adjustment of a sensor so that it performs as
accurately and consistently as possible. Whenever a sensor must provide readings
in standard units, there has to be a standard reference to calibrate the
sensor against. Let us explore this topic in more depth.
Calibration is performed on a measurement instrument to confirm its accuracy
and precision – in other words, to verify the dependability of the instrument.
The calibration of measurement tools such as sensors is the most important
precondition for the reliability of the values they provide, and thus the
cornerstone of quality control.
There are several types and methods of sensor calibration, depending on the
type of sensor being used. Some common ones include:
1. Linearity calibration: measures the linearity of a sensor over its full
range of measurement.
2. Span calibration: determines the full range of measurement of a sensor.
3. Zero calibration: determines the zero point or offset of a sensor.
4. Sensitivity calibration: determines the sensitivity of a sensor.
5. Temperature calibration: measures the effect of temperature on a sensor's
performance.
6. Hysteresis calibration: measures the hysteresis of a sensor – the
difference in output at a given input when the input is approached from
different directions.
7. Non-linearity calibration: measures the non-linearity of a sensor over its
full range of measurement.
8. Repeatability calibration: measures the repeatability of a sensor – the
degree to which the sensor produces the same output for the same input over
time.
9. Stray field calibration: measures the effect of magnetic stray fields on a
sensor's output.
It is important to note that the type of sensor and its application will
determine which type of calibration is required.
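As an illustration of the hysteresis calibration listed above, the quantity of interest can be computed from an up-sweep and a down-sweep over the same input points. The following is a minimal sketch; the function name and all readings are hypothetical, not from any real device:

```python
# Hypothetical sketch: estimating a sensor's hysteresis band from two
# sweeps over the same inputs, one rising ("up") and one falling ("down").

def hysteresis(up_outputs, down_outputs):
    """Worst-case output difference between the rising and falling
    sweeps at the same input points, i.e. the hysteresis band."""
    return max(abs(u - d) for u, d in zip(up_outputs, down_outputs))

# Outputs at inputs 0..4, approached from below (up) and from above (down):
up   = [0.00, 1.02, 2.05, 3.01, 4.00]
down = [0.05, 1.10, 2.12, 3.06, 4.00]

print(round(hysteresis(up, down), 3))  # → 0.08
```

The same pairwise comparison generalizes to any sensor whose output depends on the direction the input was approached from.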
Calibration standards, required time, and investment:
To put it simply – in order to calibrate a sensor, one needs a
reference standard, usually another calibrated tool (a sensor, a
measurement machine, etc.) that is used to take reference readings for
comparison. The already-calibrated device needs to be more accurate than the
sensor being calibrated. For this reason there are various standards to
consider, such as those of the National Institute of Standards and Technology
(NIST) or those of the International Organization for Standardization
(ISO), which are widely used by laboratories around the world.
These come with specific calibration references and the correction
factors that may be necessary, and are each aligned to the type of
sensor – such as the ISO 9001:2000 Standard for Quality Management
Systems, or the widely known ISO/IEC 17025:2005 Standard entitled
“General Requirements for the Competence of Testing and Calibration
Laboratories”.
Standard physical references are reasonably accurate physical standards
available for some types of sensors. For rangefinders these are
rulers and meter sticks; for temperature sensors, boiling
water – 100 °C at sea level – and the triple point of pure water at
0.01 °C (used to calibrate thermometers); and for
accelerometers the standard physical reference is gravity, a
constant 1 g at the surface of the Earth.
Characteristic curve – each sensor has a typical characteristic curve, which
shows the sensor’s response to an input. During calibration, the
response of the sensor is compared to the known “ideal” response.
Offset is the difference between the output of a
sensor and the ideal output (the best available reference):
it can be lower or higher. Single-point calibration is considered the
easiest way to correct an offset.
Sensitivity/slope – a difference in the slope of the sensor output
indicates that the output changes at a different rate than the ideal one. This
can be corrected with two-point calibration.
Linearity – in general, only a few sensors have
completely linear characteristic curves. For some applications this is no
issue; others require more complex calculations to linearize the output.
One-point calibration requires a single reference point; once the offset is
adjusted, the correction can be applied across the rest of the range. Good
examples are temperature sensors in control systems that need to hold the same
temperature for extended periods of time. Such sensors are linear and, within
certain measurement ranges, already have the correct slope.
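The one-point adjustment described above amounts to measuring one known reference, storing the constant error, and subtracting it from every later reading. A minimal sketch, with illustrative (made-up) values:

```python
# One-point (offset) calibration sketch. The reference value and raw
# readings below are illustrative, not from a real sensor.

REFERENCE_C = 0.01        # known reference: triple point of water, °C
raw_at_reference = 0.46   # what the uncalibrated sensor reported there

offset = raw_at_reference - REFERENCE_C   # constant error to remove

def calibrated(raw_reading):
    """Apply the stored one-point offset correction to a raw reading."""
    return raw_reading - offset

print(round(calibrated(25.45), 2))  # the 0.45 °C offset is removed → 25.0
```

This works only because, for such sensors, the error is assumed constant (the slope is already correct) across the range of interest.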
Two-point calibration is a bit more complex, as it re-scales
a sensor output against two points instead of one. A simple example is
calibrating a temperature sensor with an ice-water bath (0.01 °C) and boiling
water (100 °C at sea level). Two-point calibration corrects slope as well as
offset, and can be used when the output of the sensor is known to be
reasonably linear between the two reference points.
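Using the ice-bath and boiling-water example from the text, a two-point correction can be sketched as a linear re-scaling that maps both reference points exactly. The raw readings below are hypothetical:

```python
# Two-point calibration sketch: map two known reference points exactly,
# correcting both offset and slope. Raw readings are made up.

LOW_REF, HIGH_REF = 0.01, 100.0   # ice bath and boiling water, °C
low_raw, high_raw = 0.80, 98.50   # raw sensor output at those references

def calibrated(raw):
    """Linearly re-scale a raw reading so both reference points map exactly."""
    slope = (HIGH_REF - LOW_REF) / (high_raw - low_raw)  # slope correction
    return LOW_REF + (raw - low_raw) * slope             # plus offset shift

print(round(calibrated(0.80), 2))   # → 0.01  (low reference recovered)
print(round(calibrated(98.50), 2))  # → 100.0 (high reference recovered)
```

Readings between the two points are corrected proportionally, which is exactly why the method assumes the sensor is reasonably linear over that range.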
Multi-point calibration is the method that usually requires
the most time and gives the best results. Occasionally, transducers show
inconsistent linearity throughout the range, which causes errors at various
points across it. From three to eleven reference points may be used, and in
some cases curve fitting is performed to achieve the best available accuracy.
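One simple way to use several reference points, short of full curve fitting, is piecewise-linear interpolation between neighbouring (raw, reference) pairs. A sketch with an illustrative calibration table:

```python
from bisect import bisect_left

# Multi-point calibration sketch: piecewise-linear interpolation between
# several (raw reading, reference value) pairs. Table values are made up.
points = [(0.0, 0.0), (1.1, 1.0), (2.3, 2.0), (3.2, 3.0), (4.0, 4.0)]

def calibrated(raw):
    """Interpolate linearly between the two nearest calibration points;
    clamp to the table's end values outside the calibrated range."""
    raws = [r for r, _ in points]
    i = bisect_left(raws, raw)
    if i == 0:
        return points[0][1]
    if i == len(points):
        return points[-1][1]
    (r0, t0), (r1, t1) = points[i - 1], points[i]
    return t0 + (raw - r0) * (t1 - t0) / (r1 - r0)

print(round(calibrated(1.7), 3))  # between the 1.1 and 2.3 points → 1.5
```

Each pair of adjacent points acts as a local two-point calibration, which is why adding more reference points tames non-linearity that a single slope correction cannot.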
Recalibration: Why is it necessary?
Even the most widely known, commercially available, accurate sensors with
the most sensitive measurement show some deterioration through wear,
environmental influences, physical damage, etc. Very few sensors come with
proper calibration instructions, even though some standards clearly state the
need for them. For example, DIN EN ISO 9001 specifies the need for calibration
of measurement equipment, and requires that a process for monitoring such
equipment be set in place. A calibration in an accredited laboratory
means that the calibration took place according to DIN EN ISO/IEC 17025, and
provides a statement of the measurement uncertainties involved. These
records are also necessary for traceability of the calibration.
How frequently a sensor needs calibration depends on the type of
sensor, and sometimes even on the specific use case (the nature of the
application, the required accuracy, the environment around the system, etc.).
This topic deserves attention, as even sensors of the same type from the same
manufacturer can differ in measurement stability over time.
An accredited calibration record typically includes:
clarification of the responsible entities’ competence;
the reference standards, traceability, and documentation stating these;
the measurement result, including the uncertainty of the measurements;
manuals (documentation) of the conducted procedure;
dates, to make sure the proper calibration intervals are documented.
The Client’s Point of View
The points above have mostly covered the industry basics of the topic: the
safety regulations connected to calibration, and the time and effort
required to keep a sensor calibrated and in good working condition. At the end
of the day, the end user and a major beneficiary of the sensor is the company
that integrates it into a product. Be it the integration of the sensor into an
everyday IoT device or a traditional sensor in an industrial manufacturing
environment – the developer wants the most reliable and accurate sensor
possible, one that meets industry standards and is as economical as can be.
Usually R&D departments are responsible for finding these sensors, or the
companies that can custom-develop them. But which is the most accurate
sensor, and how is it calibrated?
As we clarified above, a calibration is conducted with reference readings
from a sensor, a measurement device, or a generally known standard that is at
least one order of magnitude more accurate. As the industry standards are
defined by such devices, it is close to impossible to prove that a sensor is
more accurate than, or even as accurate as, the most accurate known measurement
system.
This is a situation RVmagnetics has faced over its years of experience in
sensor manufacturing, R&D, and measurement system development.
RVmagnetics representatives put it this way: “When a client requires proof of
an accuracy of 0.01 °C, as for example with PT100 sensors, a sensor one
order of magnitude more accurate than our own MicroWires is needed to prove the
0.01 °C accuracy of a MicroWire, and such a sensor currently does not exist”.
RVmagnetics manufactures exceptionally accurate sensors and custom-develops
measurement solutions based on its own sensor, the smallest passive sensor in
the world, for temperature, pressure, position, vibration, and other physical
quantities. A standard calibration process is a fairly fixed procedure, and the
MicroWire sensor has proven to be more accurate and sensitive than many
conventionally available systems, and as reliable as the standardized
certifications can confirm.
With a B2B sales & marketing background in INGOs & foreign investment in government sectors, Tigran is now responsible for extensive industry research at RVmagnetics, focused on marketing the company in both the R&D and business spaces. Tigran is up to date with trends in deep tech, sensors, and innovative startups in need of niche growth. He shares this knowledge with RVmagnetics communities via blogs, publications, and news releases, while also using his experience to manage RVmagnetics' key partners' accounts.