How to choose the resolution of industrial cameras?
Suppose we are detecting surface scratches on an object that measures 10 × 8 mm, with a required detection accuracy of 0.01 mm. First assume the field of view is 12 × 10 mm. The minimum camera resolution is then (12 / 0.01) × (10 / 0.01) = 1200 × 1000, i.e. a camera of about 1.2 million pixels. In other words, if one pixel corresponds to one defect, the resolution must be no less than 1.2 megapixels; since 1.3-megapixel cameras are common on the market, a 1.3-megapixel camera would generally be used.

The practical problem is that if a single pixel corresponds to a defect, such a system will be extremely unstable, because any random noisy pixel can be mistaken for a defect. To improve the accuracy and stability of the system, a defect should cover an area of at least 3 to 4 pixels. Multiplying the 1.2-megapixel figure by three gives roughly 3 to 4 million pixels, so the camera should have no less than 3 megapixels, and a 3-megapixel camera is usually the best choice.

(I have seen many people invoke sub-pixel accuracy, claiming they can reach some fraction of a pixel and therefore do not need such a high-resolution camera. Say they claim 0.1 pixel: a defect would then correspond to 0.1 of a pixel. But defect size is measured by counting pixels, so how do you represent an area of 0.1 pixels? People who wave "sub-pixel" around like this are bluffing, and it usually shows they lack common sense.)

That said, if the system is used only for measurement, then with a sub-pixel algorithm a 1.3-megapixel camera can basically meet the requirement. Sometimes, however, edge sharpness affects edge extraction: an offset of just one pixel greatly degrades accuracy. If we choose a 3-megapixel camera instead, the extracted edge can deviate by about 3 pixels and the measurement accuracy is still guaranteed.
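To make the arithmetic concrete, here is a minimal Python sketch of the calculation described above. The function name min_resolution and the defect_area_px parameter are illustrative choices, not an established API; the numbers are the example values from the text (12 × 10 mm field of view, 0.01 mm accuracy).

```python
def min_resolution(fov_w_mm: float, fov_h_mm: float, accuracy_mm: float,
                   defect_area_px: int = 1) -> tuple[int, int, float]:
    """Return (width_px, height_px, megapixels) needed so that a feature
    of size accuracy_mm covers defect_area_px pixels of area."""
    # One pixel per smallest feature along each axis is the bare minimum.
    base_w = fov_w_mm / accuracy_mm
    base_h = fov_h_mm / accuracy_mm
    # Requiring more pixel area per defect scales the total pixel count;
    # each axis grows by the square root of the area factor.
    scale = defect_area_px ** 0.5
    w = int(round(base_w * scale))
    h = int(round(base_h * scale))
    return w, h, w * h / 1e6

# Bare minimum: one pixel per 0.01 mm defect over a 12 x 10 mm field of view.
print(min_resolution(12, 10, 0.01))                    # (1200, 1000, 1.2)

# With the 3-pixel defect area recommended in the text.
print(min_resolution(12, 10, 0.01, defect_area_px=3))  # about 3.6 MP
```

Running this with defect_area_px=3 gives roughly 3.6 megapixels, which is consistent with the rule of thumb above of choosing a camera of no less than 3 megapixels.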