Scales follow the same principles as balances, with some additional constraints that arise from the technology used and the size of the instrument. Most scales use strain gauge load cells, which yield a lower resolution than balances. In some cases the rounding error may be predominant, but for scales of higher resolution, repeatability becomes the decisive contributor to measurement uncertainty in the lower measurement range of the instrument.
Linearity deviation is often a large contributor, yet it is frequently neglected when weighing small samples. Because the relative measurement uncertainty diminishes as sample size increases, nonlinearity plays only a minor role in keeping the measurement uncertainty of the instrument below the required process tolerance. To define the critical limit of a high-resolution industrial scale, we therefore need to focus our attention on repeatability.
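To illustrate why repeatability sets the lower limit, the minimum weight is commonly estimated from the repeatability standard deviation using the convention m_min = k · s / tol, where k is a coverage factor and tol the required relative tolerance. A minimal sketch, with hypothetical numbers (not taken from the article):

```python
def minimum_weight(s, tol, k=2):
    """Smallest net load that can meet the relative process tolerance.

    s   -- repeatability standard deviation of the scale (same unit as result)
    tol -- required relative tolerance (e.g., 0.001 for 0.1 %)
    k   -- coverage factor (2 is typical for ~95 % confidence)
    """
    return k * s / tol

# Hypothetical scale: repeatability of 5 g (0.005 kg), tolerance 0.1 %
print(minimum_weight(0.005, 0.001))  # minimum weight in kg (about 10 kg)
```

Note that the result depends only on repeatability and the tolerance, not on the capacity of the scale, which is why two scales of the same capacity can have very different minimum weights.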
It is important to state that the minimum weight of balances and scales is not constant over time. This is due to changing environmental conditions that affect the performance of the instrument: vibrations, drafts, wear and tear, and temperature changes. The operator also adds variability to the minimum weight, because different users may weigh differently or operate the instrument with different levels of skill.
To ensure that you always operate above the minimum weight determined at calibration (at a particular time, under particular environmental conditions, by a qualified service technician), apply a safety factor: weigh only at loads sufficiently above the minimum weight determined at calibration. For standard weighing processes, a safety factor of two is commonly used, provided you have reasonably stable environmental conditions and trained operators. For very critical applications or a very unstable environment, an even higher safety factor is recommended.
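The safety factor translates directly into a simple operating check. The function names and numbers in this sketch are illustrative, not part of any standard:

```python
def safe_operating_minimum(min_weight_at_calibration, safety_factor=2):
    """Smallest net load recommended for routine weighing."""
    return safety_factor * min_weight_at_calibration

def load_is_acceptable(net_load, min_weight_at_calibration, safety_factor=2):
    """True if the net load stays at or above the safe operating minimum."""
    return net_load >= safe_operating_minimum(min_weight_at_calibration,
                                              safety_factor)

# Calibration determined a minimum weight of 100 kg; with a safety
# factor of two, routine loads should be at least 200 kg.
print(safe_operating_minimum(100))   # 200
print(load_is_acceptable(150, 100))  # False
```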
Another frequent misconception is that the weight of the tare vessel counts toward the minimum weight requirement. In other words, if the tare weighs more than the minimum weight, any quantity of material can be added, and the minimum weight requirement is automatically fulfilled. This suggests that with a large enough tare container, you could weigh a sample of just one gram on an industrial floor scale with a three-ton capacity and still comply with the applicable process accuracy. Because the rounding error of the digital indication is always the lowest limit of the overall measurement uncertainty, it is clear that such a small amount of material cannot be weighed accurately in any tare container. Although this is an extreme example, it clearly shows that this widespread misinterpretation does not make any sense.
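The floor-scale example can be made concrete. The rounding of a digital indication with readability d contributes a standard uncertainty of roughly d/√12 per reading, regardless of how heavy the tare is. The capacity and readability below are assumed for illustration:

```python
import math

# Hypothetical floor scale: 3,000 kg capacity, 0.5 kg readability,
# and a sample of just 1 g placed in a large tare container.
d = 0.5                      # readability of the indication, kg
sample = 0.001               # 1 g sample, kg
u_round = d / math.sqrt(12)  # rounding uncertainty per reading, ~0.14 kg
print(u_round / sample)      # relative uncertainty of the sample: > 14,000 %
```

No tare container changes this result; the rounding uncertainty is a property of the indication, so the sample itself must be large enough relative to the readability.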
Just recently, we encountered another misconception involving a dispensing application with the measured minimum weight of the scale in question at 100 kg. The company stated that its practice was to dispense 20 kg at a time, always leaving more than 100 kg of substance in the container to adhere to the minimum weight requirement. Its employees did not understand that they would have to dispense at least 100 kg—instead of 20 kg—to comply with their own accuracy requirement.
Routine Testing
“Measuring equipment shall be calibrated and/or verified at specified intervals…against measurement standards traceable to international or national measurement standards.” — ISO 9001:2008, 7.6 Control of Monitoring and Measuring Devices.
“The methods and responsibility for the calibration and recalibration of measuring, test and inspection equipment used for monitoring activities outlined in Pre-requisite Program, Food Safety Plans and Food Quality Plans and other process controls…shall be documented and implemented.” — SQF 2000 Guidance – Chapter 6.4.1.1 “Methods & Responsibilities of Calibration of Key Equipment.”
These statements delegate the responsibility for the correct operation of weighing instruments to the user. Statements like these are usually vague; they are intended to be general guidelines. Therefore, they cannot be used for daily operations. Questions such as “How often should I test my weighing instrument?” arise in situations in which guidance is needed to design standard operating procedures. Such guidelines should not be too exhaustive, and thus costly and time consuming, nor too vague, and thus insufficient to assure proper functioning. The right balance between consistent quality and sufficient productivity must be found. The following test procedures for weighing instruments are recommended for normal use:
- Calibration in situ by authorized personnel, including the determination of measurement uncertainty and minimum weight under normal utilization conditions. The aim is to assess the complete performance of the instrument by testing all relevant weighing parameters, made transparent to the user by a calibration certificate. Calibration is an important step to take after the instrument is installed and the necessary functional tests are performed.
- Routine test of the weighing system, to be carried out in situ by the user on weighing parameters that have the greatest influence on the performance of the balance or scale; the aim is to confirm the suitability of the instrument for the application.
- Automatic tests or adjustments, where applicable, using built-in reference weights; the aim is to reduce the effort of manual testing stipulated by specific FDA guidance.4
Test Frequencies
The routine testing procedures and corresponding frequencies are based on:
- The required weighing accuracy of the application;
- The impact of OOS results (e.g., for business, consumer, or environment), in case the weighing instrument does not adhere to the process-specific weighing requirements; and
- The detectability of a malfunction.
The more stringent the accuracy requirements of a weighing process are, the higher the probability is that results will fail to comply. Therefore, test frequency must be increased. Similarly, if the severity of the impact increases, testing should be performed more frequently to offset the likelihood of noncompliance (Figure 2). If malfunction of the weighing instrument is easily detected, test frequency should be decreased. The frequency of testing ranges from daily, for risky applications (user or automatic tests), to weekly, monthly, quarterly, semi-annually, and yearly.