Imagine watching the news when a story titled “Check your fridge and pantry for food products recalled this week” shows a picture of your brand’s packaging. It’s not that uncommon. For example, in late 2018 one manufacturer recalled 29,028 pounds of frozen, ready-to-eat poultry and pork sausage links after five people called USDA’s Food Safety and Inspection Service to report that they had found metal pieces in their sausage.
While regulatory pressure and the risk of financial loss have pushed many manufacturers to invest in detection technologies, eliminating risk entirely remains a challenge. Thankfully, today’s metal detectors and X-ray detection systems offer higher levels of sensitivity, but that wasn’t always the case, and not every manufacturing facility has the latest technology.
This article will explore the evolution of metal detection technology for food safety. Metal detection systems provide reliable, cost-effective protection from even the smallest metal contaminants found anywhere in a food production process. They can also help improve operational efficiency and reduce expensive downtime, service costs, and repair bills. And metal detectors are suited to a wide range of demanding food processing applications and packaging environments.
Metal Detector Technology
Metal detectors are common across food processing facilities to meet HACCP (Hazard Analysis and Critical Control Point) requirements. Most often they are placed at the end of the line, as the last defense before a packaged product is sent on its way to the consumer. The core technology, though, has always had limitations. One is the so-called “product effect,” in which a detector cannot distinguish a conductive product, or one with high mineral content, from a metal contaminant. Another is susceptibility to “noise” from the many possible sources in a typical harsh, industrial food production environment.
Basic metal detector technology relies on coils that are wound on a non-metallic frame and connected to a radio frequency transmitter and receiver. The transmitted field “excites” any metal object passing through the detector, which produces very small changes in the return signal picked up by the receiver coils. Digital signal processing algorithms are then used to differentiate between the expected product signal and that of an unexpected foreign object. The technology works, but historically its performance could be inconsistent and sometimes even unpredictable. Recently, with the introduction of multiscan metal detection technology, this has started to change.
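To make the signal-processing idea concrete, the sketch below shows, in very simplified form, how a product’s expected signal might be learned from known-clean passes and how a reading that deviates too far from that signature could be flagged. This is a hypothetical illustration only; the (I, Q) readings, the function names, and the threshold are assumptions made for the example, not any vendor’s actual algorithm.

```python
# Hypothetical sketch (not a vendor algorithm): discriminate a product's
# expected signal from a metal-contaminant signal using the receiver's
# in-phase (I) and quadrature (Q) components, the general idea behind
# phase-based digital signal processing in balanced-coil metal detectors.
import statistics

def learn_product_signature(clean_readings):
    """Estimate the expected product signal from passes of known-clean product.

    clean_readings: list of (I, Q) tuples from the receiver coil.
    Returns the mean signal and a spread used to set a rejection threshold.
    """
    i_vals = [i for i, _ in clean_readings]
    q_vals = [q for _, q in clean_readings]
    mean = (statistics.mean(i_vals), statistics.mean(q_vals))
    spread = max(statistics.pstdev(i_vals), statistics.pstdev(q_vals), 1e-9)
    return mean, spread

def is_contaminated(reading, signature, sigma=5.0):
    """Flag a reading whose deviation from the product signature is too large."""
    (mean_i, mean_q), spread = signature
    i, q = reading
    deviation = ((i - mean_i) ** 2 + (q - mean_q) ** 2) ** 0.5
    return deviation > sigma * spread

# Usage: "teach" the detector with clean product, then inspect new packs.
signature = learn_product_signature([(1.02, 0.48), (0.99, 0.51), (1.01, 0.50)])
print(is_contaminated((1.00, 0.49), signature))  # clean pack -> False
print(is_contaminated((1.60, 0.90), signature))  # metal-like shift -> True
```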
The Evolution of Frequencies
Early metal detection technologies for the food industry were limited to a single, fixed frequency. A manufacturer could best detect a piece of stainless steel using a high frequency, but when a wet, warm, or salty product was introduced, it would be forced to reduce the frequency, and thus the sensitivity, because of the product effect. Even this simple frequency change required setup by skilled technicians, who might spend hours selecting the “best” frequency for detection of all metal types; users could not make the change themselves.
Single, fixed-frequency metal detectors had limitations for the typical food manufacturing environment given the range of products to be tested and the variability of metal contaminants that could enter the process. That’s why manufacturers started adding second and third frequency choices (always running just one frequency at a time), giving users more flexibility. Manual frequency switching became more common but was only marginally less onerous: Expertise was still needed to optimize detection. Nonetheless, this was an advancement since it introduced more frequency flexibility to metal detection.
The next advancement in metal detection was the development of frequency selection via software. The “best” single frequency for a given application could then be selected prior to production by scanning a product many times and testing detection. This was known as variable frequency metal detection, and it enabled setup without the need for a specialist. Manufacturers still were forced to live with the “best” single frequency compromise, however, and accept its lower overall performance.
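As a rough illustration of how software-driven frequency selection might work, the sketch below scans a product at each candidate frequency, measures how far seeded test-contaminant signals stand out from the product’s own signal, and keeps the single frequency with the largest margin. The scan_pass interface, the detection_margin function, and the numbers in the usage example are assumptions made for illustration, not a description of any specific detector.

```python
# Hypothetical sketch of "variable frequency" setup: test each candidate
# frequency before production and keep the best single-frequency compromise.

def detection_margin(product_signals, contaminant_signals):
    """Gap between the weakest contaminant signal and the strongest product signal."""
    return min(contaminant_signals) - max(product_signals)

def select_best_frequency(candidate_frequencies, scan_pass):
    """Pick the single frequency with the largest detection margin.

    scan_pass(freq) is assumed to run the product through the detector, with
    and without seeded test pieces, and return
    (product_signals, contaminant_signals) for that frequency.
    """
    best_freq, best_margin = None, float("-inf")
    for freq in candidate_frequencies:
        product_signals, contaminant_signals = scan_pass(freq)
        margin = detection_margin(product_signals, contaminant_signals)
        if margin > best_margin:
            best_freq, best_margin = freq, margin
    return best_freq, best_margin

# Usage with canned numbers standing in for real scan passes:
fake_scans = {
    50_000: ([0.9, 1.1], [1.2, 1.4]),   # wet, conductive product: modest margin
    300_000: ([0.8, 1.0], [2.0, 2.3]),  # dry product: larger margin
}
print(select_best_frequency(fake_scans, lambda f: fake_scans[f]))
```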