A Manufacturer's Perspective on Calibration Intervals
This article is based on an original publication by Fluke Corporation.
Common industry standards dictate that test and measurement equipment be calibrated regularly to maintain a quality assurance program. In practice this usually means a one-year interval, the recommendation most commonly given for precision test equipment. While owners of large equipment inventories in fields such as military and defense may use more advanced models to manage calibration intervals, such tools are not generally required.
Users may opt for a simpler method of calibration interval management because advanced models are too complex, or too incomplete, for their specific needs; they may simply adopt the manufacturer's recommendation.
Manufacturer's Recommended Intervals
One might assume that if manufacturers were truly analyzing the proper calibration intervals for their products, there would be more variation than a uniform one year. In practice, however, the design community is often given the one-year specification as a target, because variability in performance and stress makes predicting appropriate calibration intervals difficult.
The task of determining this reliability target has challenged the metrology community for decades. The resulting models describe the reliability function with considerable precision, with the goal of estimating in-tolerance reliability as a function of the calibration interval.
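As an illustration of the idea, a minimal sketch using a simple exponential reliability model follows. The exponential form, the function names, and the parameter values (MTBF, reliability target) are assumptions for this example, not part of the original article; real interval-analysis work uses a range of more sophisticated reliability functions.

```python
import math

def in_tolerance_reliability(interval_months: float, mtbf_months: float) -> float:
    """Probability the instrument is still in tolerance at the end of the
    calibration interval, under an assumed exponential model R(t) = exp(-t/MTBF)."""
    return math.exp(-interval_months / mtbf_months)

def interval_for_target(target_reliability: float, mtbf_months: float) -> float:
    """Longest calibration interval that still meets the in-tolerance
    reliability target, by inverting the exponential model."""
    return -mtbf_months * math.log(target_reliability)

# Hypothetical example: with a 60-month mean time between out-of-tolerance
# events, find the interval that keeps in-tolerance reliability at 95%.
interval = interval_for_target(0.95, 60.0)
```

Under these assumed numbers the resulting interval is only a few months, which illustrates why a blanket one-year recommendation and a reliability-driven interval can disagree.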
For example, a cost model can improve calibration interval optimization, beginning with the concept of accumulated liability. The consequences of an out-of-tolerance condition in a calibration standard accumulate over time: longer calibration intervals carry a higher consequence cost for a given standard, because more calibrations have been performed with it while it was potentially out of tolerance. In some cases it is advisable for the user to calibrate more often than the cost model suggests, and an insurance model may more accurately describe the lab's situation.
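The trade-off such a cost model captures can be sketched numerically. In this sketch the exponential reliability assumption, the cost figures, and all names (`cal_cost`, `oot_cost_per_year`, etc.) are illustrative assumptions, not from the article: calibrating more often raises the annual calibration bill, while calibrating less often increases the expected time spent unknowingly out of tolerance, and the optimum balances the two.

```python
import math

def expected_annual_cost(interval_months: float,
                         cal_cost: float,
                         oot_cost_per_year: float,
                         mtbf_months: float) -> float:
    """Expected yearly cost: calibration fees plus a consequence cost
    proportional to the expected fraction of time spent out of tolerance
    (exponential model with mean mtbf_months, assumed for illustration)."""
    calibration_cost = (12.0 / interval_months) * cal_cost
    # Expected fraction of the interval spent out of tolerance:
    # 1 - E[min(T, I)] / I, with T exponential.
    oot_fraction = 1.0 - (1.0 - math.exp(-interval_months / mtbf_months)) \
        * mtbf_months / interval_months
    return calibration_cost + oot_cost_per_year * oot_fraction

# Hypothetical figures: $300 per calibration, $5000/year consequence rate
# while out of tolerance, 60-month MTBF. Scan candidate intervals.
best = min(range(1, 37),
           key=lambda m: expected_annual_cost(m, 300.0, 5000.0, 60.0))
```

With these assumed figures the minimum falls at an interior interval of several months: short intervals are dominated by calibration fees, long ones by accumulated consequence cost.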
The evolution of complex reliability models used to describe the relationship between calibration interval and reliability is essential to the evolution of metrology. Combining reliability models with cost models will yield further improvements in optimizing calibration intervals.