Color Image Acquisition
Sam Liebo - Lead Applications Engineer
A common misconception about color image acquisition is that each pixel sees every color (red, green, and blue). This is not the case with standard color sensors. A common technique to give color sensitivity to a black & white image sensor is the application of a color mosaic filter on top of the sensor. This has some negative effects on image quality. All else equal, your effective resolution and edge quality will be lower with a color image when compared to a monochrome image.
Bayer Pattern

The most common mosaic filter is the Bayer pattern. With the Bayer pattern, each pixel is covered by a filter of a single color, arranged in a fixed layout: half of the pixels are green (G), a quarter are red (R), and a quarter are blue (B).
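As an illustration, sampling a full-color image through an RGGB Bayer mask can be simulated in a few lines. This is a minimal sketch, not any specific camera's implementation; the `bayer_mosaic` name and the RGGB layout are assumptions for illustration:

```python
import numpy as np

def bayer_mosaic(rgb):
    """Sample a full-color H x W x 3 image through an RGGB Bayer mask.

    Returns an H x W single-channel image in which each pixel keeps
    only the one color component its filter passes.
    """
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w), dtype=rgb.dtype)
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R on even rows, even cols
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G on even rows, odd cols
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G on odd rows, even cols
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B on odd rows, odd cols
    return mosaic
```

Note that the two green assignments cover half of all pixel sites, with red and blue taking a quarter each, matching the proportions above.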
Each pixel in the final color image is composed of three color components: red, green, and blue. For each pixel, the two missing colors are interpolated from the surrounding pixels. For example, if a pixel is filtered for green, its green component is measured directly, but its red and blue components must be calculated from the average values of the surrounding red-filtered and blue-filtered pixels. Through this software interpolation, each pixel is assigned a value from 0 to 255 for each of the two unknown color components. The following are examples (courtesy of Cognex) of how the values of all three color components are calculated for a single pixel.
In this example, the values of the RGB components for pixel G1 are (80,218,45).
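That interpolation step can be sketched for a single green-filtered pixel. Assuming an RGGB layout in which the green pixel has red neighbors to its left and right and blue neighbors above and below, the missing components are simple averages (the function name, layout, and neighbor values below are illustrative assumptions, not taken from the Cognex example):

```python
import numpy as np

def interpolate_at_green(mosaic, r, c):
    """Estimate the full RGB value at a green-filtered pixel by
    averaging the adjacent filtered neighbors (bilinear demosaicing).

    Assumes (r, c) is a green site with red pixels at (r, c-1) and
    (r, c+1), and blue pixels at (r-1, c) and (r+1, c).
    """
    red = (int(mosaic[r, c - 1]) + int(mosaic[r, c + 1])) // 2
    green = int(mosaic[r, c])  # measured directly, no interpolation needed
    blue = (int(mosaic[r - 1, c]) + int(mosaic[r + 1, c])) // 2
    return red, green, blue
```

With hypothetical red neighbors of 80 and 80, a measured green of 218, and blue neighbors of 44 and 46, this yields (80, 218, 45) — the kind of result described above.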
What does this mean in terms of image quality? Here is an example of a Bayer pattern representation of a full-color image.
1) Actual image.
2) Intensity value that each pixel sees – around half the information is simply “lost”.
3) Color applied to the intensity image.
4) Interpolation produces the final image.
From this it is easy to see why effective resolution and edge quality are lower in the final image.
Color Artifacts

In addition to an overall lower-resolution image, you might experience color artifacts.
Aliasing can occur when fine detail in the scene interferes with the Bayer pattern on the sensor. The resulting sampling error can create false colors that do not exist in the scene.
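This effect can be demonstrated numerically. In the sketch below (again assuming an RGGB layout), a scene of pure black-and-white one-pixel stripes lines up so that every red-filtered pixel sees white and every blue-filtered pixel sees black; any interpolation of those samples then reconstructs a red-tinted image even though the scene contains no color at all:

```python
import numpy as np

# Scene: vertical one-pixel stripes, alternating white (255) and black (0).
h, w = 4, 8
scene = np.zeros((h, w), dtype=np.uint8)
scene[:, 0::2] = 255  # white columns on even indices

# RGGB layout: red filters sit on even rows/even columns, blue filters
# on odd rows/odd columns. With this stripe phase, every red-filtered
# pixel samples a white column and every blue-filtered pixel samples a
# black column.
red_samples = scene[0::2, 0::2]
blue_samples = scene[1::2, 1::2]

print(red_samples.mean(), blue_samples.mean())  # 255.0 vs 0.0
```

Shifting the stripes by one pixel would flip the bias toward blue, which is why these false colors shimmer as the part or camera moves.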
Zippering is the common name for the alternating on/off blur that appears along edges, another byproduct of interpolating across the Bayer pattern.
What are the alternatives to using a color camera? In many cases, a monochrome camera combined with colored lighting can perform better than a color camera. The color of an object is the color of light it reflects. Typical ambient light is white, which contains all the colors, but if the scene is lit with only one color, the apparent intensity of the object in question will vary drastically with the object's own color.
If you shine light of the same or a similar color (adjacent on the color wheel) onto an object, it will be reflected and brighten the object. If you shine light of an opposite color (non-adjacent on the color wheel), it will be absorbed and therefore not brighten the object.
For example, shining red light on a red part makes it appear bright, whereas shining green light on the same part makes it appear dark.
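This rule of thumb can be sketched with a simplified three-channel reflectance model. The numbers below are illustrative, not measured values:

```python
def mono_intensity(reflectance, light):
    """Approximate gray level a monochrome camera would see: each
    illumination component is reflected in proportion to the object's
    reflectance in that channel (all values normalized 0.0-1.0,
    ordered R, G, B).
    """
    return sum(r * l for r, l in zip(reflectance, light)) / len(light)

red_part = (0.9, 0.1, 0.1)     # reflects mostly red
red_light = (1.0, 0.0, 0.0)    # same color as the part
green_light = (0.0, 1.0, 0.0)  # opposite color

bright = mono_intensity(red_part, red_light)    # part appears bright
dark = mono_intensity(red_part, green_light)    # part appears dark
```

Under red light the part returns about ten times the intensity it does under green light, which is exactly the contrast a monochrome camera can exploit.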
Take a part that contains several colors, for example. By using colored light, we can highlight any one of those colors without the use of a color camera.
If an application requires sorting multiple colors, then a color camera would be beneficial. Otherwise, a monochrome camera with colored lighting will suffice and give you a higher-resolution, sharper image.