Dear Alan Johnson,
Thanks for your response.
As you (and the Walrus?) noted, one has to start somewhere.
After a beer and another read of your previous post, my conclusion from your interesting (alarming?) list of caveats is that (1) a well-defined test sample / "neutral" procedure is a necessary starting point for any comparison of (intrinsic) machine capabilities, and (2) the "field" performance for relevant matrices will obviously be an equally important practical aspect, and probably the dominant one if other basic specs are similar across brands (are they??).
The above general approach is, in my experience, fairly typical for measurement equipment and other products (e.g. disinfectants).
Post #2 of this thread more or less spells out a procedure for (1) above.
The review link I referred to previously also gives an overview of composition aspects, e.g. paras (1) and (4).
Accordingly, I assume (although it is rarely stated) that the caveats are allowed for in commercial machine specs; otherwise any published basic sensitivity data would indeed seem to be meaningless.
As far as matrix effects go, I do appreciate your point that ultimately they may well be crucial. Unfortunately this thread has no specific application data, so for the moment one can perhaps only comment on the "pure" case. As discussed in other references posted on this forum, it behooves the user to evaluate (or the seller to supply?) the consequences for their own matrix, for example using the dummy procedures covered in various other threads.
So I would like to re-pose one simple but typical query from these threads, e.g.:
For any well-defined materials/procedures, are there any typical quantitative data available on relative detector sensitivities for a progression of materials such as "stainless steel", "ferrous", etc.? And for different brands of detector? (I appreciate that machine parameters such as aperture size may well restrict the available comparisons, even if I don't exactly know why.)
Hope the above is not too scrambled, and I appreciate your patience. TGISa.
Rgds / Charles.C
Hi Charles,

There is quantitative data for different brands of detectors, both held by the metal detector manufacturers (in-house benchmarking) and from evaluations carried out by the larger food manufacturers when assessing prospective suppliers, which tend to be product-relative. Both sources are pretty much confidential, i.e. not in the public domain.

Nearly all suppliers of detectors will have internal documents (graphs/databases) that provide best-case sensitivities for ferrous/non-ferrous/stainless steel at given aperture sizes, used for quotation/estimation purposes, which may or may not be moderated with real-world data. These also tend not to be in the public domain, as they can be open to misinterpretation due to the high degree of variability found in real-world applications.

If there is no product effect, then as a rule of thumb the detection level for ferrous metals will be the same as for non-ferrous, and stainless steel will be 20-30% less detectable. Unfortunately, there is no practical rule of thumb that can be applied where the product/packaging is conductive (or magnetic), due to the magnitude and variability of the effect in some cases. Yes, you can apply a general performance downgrade factor to produce a guideline specification, but the safest way to proceed with a product that may adversely affect a metal detector's performance is to get it tested by the prospective manufacturer(s) and obtain a definitive spec. All the major players have in-house test facilities and the ability to produce test reports.
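To make the rule of thumb concrete, here is a minimal Python sketch. This is my own illustration, not from any manufacturer's documentation; the 1.2-1.3 stainless multipliers are simply the 20-30% figure above expressed as a factor on the minimum detectable sphere diameter:

```python
def estimate_sensitivities(ferrous_mm: float) -> dict:
    """Rule-of-thumb detectable test-sphere diameters for a clean
    (no product effect) application, given the ferrous sensitivity
    quoted for a particular aperture size.

    Assumptions (per the rule of thumb above):
      - non-ferrous detects at the same sphere size as ferrous
      - stainless steel is 20-30% less detectable, taken here as a
        20-30% larger minimum detectable sphere diameter
    """
    return {
        "ferrous_mm": ferrous_mm,
        "non_ferrous_mm": ferrous_mm,                          # same as ferrous
        "stainless_mm": (ferrous_mm * 1.2, ferrous_mm * 1.3),  # 20-30% worse
    }

# Example: a detector quoted at 1.5 mm ferrous for a given aperture
print(estimate_sensitivities(1.5))
# {'ferrous_mm': 1.5, 'non_ferrous_mm': 1.5, 'stainless_mm': (1.8, 1.95)}
```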
Here is an example of how it can go wrong. Take a ready meal (TV dinner): meat, sauce, rice. Frozen below -18°C there is no (or virtually no) effect from the product, so a supplier quotes according to the above, i.e. sensitivity determined by the aperture size alone, with ferrous and non-ferrous equal and stainless in proportion. This quoted specification is achieved on start-up, but the production line is then stopped with packs in transit, the product thaws out on the surface (a big effect), and the detector rejects everything until normality is resumed. If this were a frequent occurrence then some form of compensation would need to be applied to the detector to accommodate line stoppages, and this would most likely result in a fall-off in sensitivity. Alternatively, the speed of transit through the freezer could change, resulting in the core of the product not freezing, with a similar (but less adverse) effect.
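To see why that compensation costs sensitivity, here is a toy Python model. It is entirely my own sketch; all signal values are invented and in arbitrary units:

```python
# Toy model: the detector's reject threshold must sit above the largest
# product signal it is expected to tolerate. Raising the threshold to
# ride out a thawed-surface signal means small metal signals that were
# previously above threshold no longer trigger a reject.
frozen_product_signal = 0.05   # pack fully frozen: near-zero product effect
thawed_product_signal = 0.60   # surface thawed during a line stoppage

# Hypothetical signals from stainless test spheres (diameter mm -> signal)
metal_signal = {1.5: 0.40, 2.0: 0.75, 2.5: 1.20}

for label, threshold in [("tuned for frozen product", frozen_product_signal * 2),
                         ("compensated for thawing", thawed_product_signal * 1.2)]:
    detectable = sorted(mm for mm, s in metal_signal.items() if s > threshold)
    print(f"{label}: threshold {threshold:.2f} -> detects {detectable} mm spheres")
# tuned for frozen product: threshold 0.10 -> detects [1.5, 2.0, 2.5] mm spheres
# compensated for thawing: threshold 0.72 -> detects [2.0, 2.5] mm spheres
```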
Are there significant differences in performance between the major players? In terms of sensitivity I would say not (OK, there will be someone out there with data saying that on product A supplier X was better than Y), but if you line them up in a test lab there's not a lot in it. The key differentiators tend to be robustness, reliability, ease of use, ability to integrate into the production line, and availability of support if things go wrong. You can have the most sensitive machine on the market, but it's no use if it dies every time someone waves a wet cloth at it, or if you need a degree in computer science to set it up.

It is also worth remembering that metal detector (performance) test strips are not really representative of real-world metallic foreign bodies, which tend to be irregular in shape. A metal detector set to detect 2.0mm stainless steel, for example, could quite easily pass a 20mm long contaminant if the shape and orientation were worst case.
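As a crude arithmetic illustration of that point (my own, and it deliberately ignores orientation effects, which in practice dominate): a long thin sliver can simply contain less metal than the calibrated test sphere.

```python
import math

sphere_d = 2.0                 # mm: calibrated stainless test-sphere diameter
wire_d, wire_len = 0.4, 20.0   # mm: a hypothetical thin 20mm wire sliver

sphere_vol = math.pi * sphere_d**3 / 6           # volume of the test sphere
wire_vol = math.pi * (wire_d / 2)**2 * wire_len  # volume of the wire

print(f"test sphere: {sphere_vol:.2f} mm^3, 20mm wire: {wire_vol:.2f} mm^3")
# test sphere: 4.19 mm^3, 20mm wire: 2.51 mm^3 -> less metal than the
# sphere the detector was set up on, before orientation even comes into it
```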
Recent improvements in metal detector sensitivity have been incremental, and this is likely to remain the case in the future. In terms of measurable improvements (especially for stainless steel), X-ray is the technology to look at, but obviously there is the cost to consider.
I hope this has been useful, and has answered more questions than it has posed!
Best regards
Alan