CMOS vs. CCD: Which Is Better?
(March 14, 2004)

I happened to come across a quick comparison of CMOS and CCD imagers on How Stuff Works; the article I will be discussing is CCD vs. CMOS Sensors. I've seen some good information on this site, but this time it's a little misleading, and I felt I needed to correct it. So, let's take a look at what the article says and break it down...


"CCDs use a special manufacturing process to create the ability to transport charge across the chip without distortion. This process leads to very high-quality sensors in terms of fidelity and light sensitivity..."


Yes...and no. This is a general statement and could be applied to any imager, including CMOS. When we use digital cameras, we rarely see an imager's true quality, because the camera manufacturer's software interprets the image data for us and applies its own kind of "quality control". By this I mean the software has built-in noise reduction algorithms, color balancing (more often, color bias), saturation adjustments, and other processing. The closest you can get to seeing data from an imager without it being compromised is taking a picture in RAW format. So it's not the actual imager which is superior per se; it's how the camera manufacturer's software interprets the image data. If anything, this statement should say the following: the software which interprets the data within the camera, in addition to the varying quality of an imager, will be the determining factor of image quality, not the imager exclusively.

"CCD sensors, as mentioned above, create high-quality, low-noise images. CMOS sensors, traditionally, are more susceptible to noise."


Technically, this is correct, but it does not apply to all CMOS sensors. Traditionally, CMOS sensors were susceptible to noise, especially in low light, because each pixel operated on a low voltage. However, Canon has pretty much solved this problem by placing an "amplifier" at each pixel. This presented a downside, though: since there was no uniformity of amplification from pixel to pixel, there was additional noise. In response, Canon created an "on-chip noise-removal technology", which allows the sensor to read out signals with low noise (i.e., a high S/N ratio) through a built-in circuit that uses a form of dark frame subtraction at the final output stage. What happens is, the circuit takes a sample of the image plus the noise, and then takes another sample of just the noise. The two samples are then "subtracted", and what remains is the final output image signal. The entire process takes place in milliseconds, so you don't even notice it.
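To see why this subtraction trick works, here's a minimal sketch of the general dark-frame-subtraction idea in Python (this is an illustration of the technique, not Canon's actual on-chip circuit; all the numbers are made up). The key assumption is that the per-pixel amplifier noise is a *fixed pattern*: it repeats identically from frame to frame, so sampling it on its own lets you cancel it out.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated "true" scene (what an ideal, noiseless sensor would record).
scene = rng.uniform(50, 200, size=(4, 4))

# Fixed-pattern noise: per-pixel offsets from non-uniform amplifier
# gain/offset. It is the SAME in every exposure, which is exactly
# what makes dark-frame subtraction possible.
fixed_pattern = rng.normal(0, 5, size=(4, 4))

exposure = scene + fixed_pattern    # light frame: image + noise
dark_frame = fixed_pattern          # dark frame: noise only (no light)

corrected = exposure - dark_frame   # the pattern cancels out

assert np.allclose(corrected, scene)
```

Note that only the repeatable (fixed-pattern) component cancels this way; truly random noise, such as photon shot noise, differs between the two samples and is not removed.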

Both CMOS and CCD sensors are vulnerable to noise, but at opposite ends of the power spectrum. An ordinary CMOS sensor has problems in low light because of the low voltage traveling across the sensor, while a CCD imager has noise problems not because of low voltage, but because of the high voltage traveling across its sensor. It's a no-win situation, and this is where the company which writes the interpreting software for that sensor, and which modifies the sensor to its own specifications, makes the difference in actual image quality.


"Because each pixel on a CMOS sensor has several transistors located next to it, the light sensitivity of a CMOS chip is lower. Many of the photons hitting the chip hit the transistors instead of the photodiode."


This is why Canon placed so-called "amplifiers" (i.e., signal converters which translate a charge into an image signal) on each pixel, to make the pixels more sensitive (as opposed to a CCD imager, which has its amplifiers located off to the side of the imager). As stated above, this brought a downside in the form of noise, so they countered it by implementing a special circuit which performs noise frame subtraction just before the final output stage.

"CMOS sensors traditionally consume little power. Implementing a sensor in CMOS yields a low-power sensor. CCDs, on the other hand, use a process that consumes lots of power. CCDs consume as much as 100 times more power than an equivalent CMOS sensor."


Most of this is correct, but as for a CCD consuming as much as 100 times more power...I'm not so sure. I suppose this could be possible on a CCD imager I'm not aware of, but most current CMOS sensors consume about 1/5 or so the power of a CCD imager. The Foveon X3 chip is CMOS, and it might consume even less. In other words, a CCD is more apt to consume around 5-10x as much power as a CMOS sensor.

"CMOS chips can be fabricated on just about any standard silicon production line, so they tend to be extremely inexpensive compared to CCD sensors."


This is basically correct. However, CCD manufacturing has also been pretty much mastered by now, so the difference is not so significant these days. The sheer quantity of digital cameras being released, and their low prices, is a testament to that.

"CCD sensors have been mass produced for a longer period of time, so they are more mature. They tend to have higher quality pixels, and more of them."


This is basically incorrect. I remember taking Propositional Logic in college: a dog has a tail; a cat has a tail; therefore, all dogs are cats. CCD sensors have indeed been produced for a longer period of time, but that has NOTHING to do with the quality of a CMOS sensor.

"Based on these differences, you can see that CCDs tend to be used in cameras that focus on high-quality images with lots of pixels and excellent light sensitivity. CMOS sensors usually have lower quality, lower resolution and lower sensitivity. However, CMOS cameras are less expensive and have great battery life."


Well, need I mention those Propositional Logic courses I took in college again? Take the Sony 828 as an example of a camera that has lots of pixels but has problems in the image quality department. CMOS sensors are finding their way into higher-megapixel cameras, and CCD imagers are not. Take Canon and Nikon, the two foremost leaders in imaging technology: the Canon 1Ds and the upcoming 1D MK II are CMOS, and the Nikon D2h uses a CMOS hybrid. The yet-to-be-announced D2x is most likely going to use the same kind of imager.

Final Thoughts

There is really no overall advantage between a standard CMOS and a standard CCD sensor. It is only when we factor in the modifications made to that sensor, and the software the manufacturer uses to interpret the image signal, that we begin to see the differences (as with Canon's CMOS sensors). There was a reason Canon decided not to put a CCD imager in the 1D MK II. The original 1D had a 4MP CCD in it, but as we get into higher-megapixel imagers with large pixel areas, Canon concluded it was best to go with CMOS. Nikon is doing the same: their LBCAST sensor is a CMOS/CCD hybrid, and if Nikon thought the CCD imagers in their DSLRs would do just fine for upcoming larger-megapixel cameras, they would not have decided to make a CMOS hybrid sensor.

Finally, it is very hard to discuss imagers and quality intelligently without bringing in the specs of the pixel, and how large or small the pixel is. Small pixel areas mean more noise. Period. It's a physics fact. Large pixel areas mean better image quality. This is not to say there will be no problems with cameras that have large pixels, but since less light hits a small pixel, pixel size is without a doubt a factor in imaging technology and quality.
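The "small pixels mean more noise" point can be made concrete with a back-of-the-envelope shot-noise calculation. If a pixel collects N photons on average, the statistical fluctuation is sqrt(N), so the shot-noise-limited SNR is N/sqrt(N) = sqrt(N), and since photon count scales with pixel area, a pixel twice as wide doubles its SNR. The sketch below assumes a hypothetical illumination level and hypothetical pixel pitches, chosen just to show the scaling:

```python
import math

# Hypothetical illumination level: photons collected per square micron.
PHOTONS_PER_UM2 = 100.0

def shot_noise_snr(pitch_um: float) -> float:
    """Shot-noise-limited SNR for a square pixel of the given pitch (microns).

    Photons collected scale with area (pitch squared); shot noise is
    sqrt(photons), so SNR = photons / sqrt(photons) = sqrt(photons).
    """
    photons = PHOTONS_PER_UM2 * pitch_um ** 2
    return math.sqrt(photons)

for pitch in (2.0, 4.0, 8.0):  # hypothetical pixel pitches
    print(f"{pitch:.0f} um pixel: SNR ~ {shot_noise_snr(pitch):.0f}")
# → 2 um pixel: SNR ~ 20
# → 4 um pixel: SNR ~ 40
# → 8 um pixel: SNR ~ 80
```

Doubling the pixel pitch quadruples the light collected and doubles the best-case SNR, which is the physics behind large-pixel sensors producing cleaner images at the same illumination.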




©2004 DigitalDingus. All rights reserved.