33

It is widely observed that the blue channel in digital cameras is the noisiest. I've certainly noticed that with my camera. Why is this?

Is it an artifact of some particular technology (e.g. Bayer arrays or CMOS sensors), is it something to do with the physics of higher-frequency light, or is it related to human vision?

Followup question: Why are sensors less sensitive to blue light?

mattdm
  • 1
    You might find this of interest: http://micro.magnet.fsu.edu/primer/digitalimaging/concepts/quantumefficiency.html (nutshell answer is less sensitive to blue). Too much tech in it for weekend light reading for me. ;) – Joanne C Apr 10 '11 at 16:12
  • 2
    I find it ironic and rather amusing that Matt's own question shows up second in the search link in his own post. ;D – jrista Apr 10 '11 at 18:30
  • @jrista - ha ha, that's hilarious! – Joanne C Apr 11 '11 at 02:54
  • I think it means the site is working. :) – mattdm Apr 11 '11 at 02:56
  • @Tall Jeff's answer below is a great start (as is @coneslayer's shorter comment), but I don't feel like it addresses the general question (now expanded above); I've added a bounty in hopes of getting more general and authoritative answers. Thanks. – mattdm Apr 13 '11 at 13:06
  • Backside-illuminated sensors will see an improved blue response. Much of the blue light is attenuated as it passes through the non-sensitive gate structure of a frontside-illuminated sensor. See typical spectral response curve (for research-grade CCDs) at http://www.ccd.com/ccd101.html – coneslayer Apr 13 '11 at 13:16
  • @coneslayer — you can edit your answer below to include that.... – mattdm Apr 13 '11 at 13:28
  • @mattdm - Relative to your follow-up refinement, the current state of the art is really mostly a matter of cost/performance optimization. There is nothing inherent to physics that requires blue performance to be worse, only that improving it would be MUCH MUCH more expensive given current device constructions, and given that the human eye is not very sensitive on the blue/yellow color axis, we're already doing "good enough". In fact, I'm sure most camera makers would prefer total cost to drop before paying the same or more just to improve blue noise for almost all common applications. – Tall Jeff Apr 13 '11 at 23:21
  • @Tall Jeff — you can edit your answer below to improve it by adding information like that. Thanks! – mattdm Apr 13 '11 at 23:49
  • @mattdm - Done & thanks for your active participation on this site!! – Tall Jeff Apr 14 '11 at 01:37
  • JoanneC: thanks, that link is super-informative. Do you think you could summarize some of it into an answer? Also, how well does it generalize from scientific instruments to photography market gear, and from CCD to CMOS? – mattdm Apr 23 '11 at 03:30

4 Answers

32

Given the current state of the art, the noise in the blue channel is a combination of cascading effects that work together to make blue "look" the worst. First, with the Bayer pattern, there are twice as many green pixels as red or blue ones in the matrix*. This immediately puts blue and red at a spatial disadvantage compared to the green channel, and results in much more apparent noise in those two channels when the RGB triplets are reconstructed from adjacent sensor pixels. For example, a 10 MP sensor has 5 M source green pixels, 2.5 M red ones, and 2.5 M blue ones. When that raw information is formed into the final 10 M RGB triplets, there can be no better than half as much information for the red and blue channels, and this shows up as a form of noise in the final image.
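
To make the 2:1:1 split concrete, here's a minimal sketch (a hypothetical 10 MP sensor; the 2×2 RGGB tile is the standard Bayer layout, but the dimensions are just an example):

```python
import numpy as np

# A hypothetical 10 MP sensor tiled with the standard RGGB Bayer pattern:
#   R G
#   G B
tile = np.array([["R", "G"],
                 ["G", "B"]])
mosaic = np.tile(tile, (2500 // 2, 4000 // 2))  # 2500 x 4000 = 10 MP

for ch in "RGB":
    n = np.count_nonzero(mosaic == ch)
    print(f"{ch}: {n:,} photosites ({100 * n // mosaic.size}%)")

# R: 2,500,000 photosites (25%)
# G: 5,000,000 photosites (50%)
# B: 2,500,000 photosites (25%)
```

Half the photosites measure green; the missing red and blue values have to be interpolated from sparser, more distant neighbors, which is where the extra apparent noise in those channels comes from.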

The next effect has to do with the spectral sensitivities of the sensor system through the red, green, and blue filters. As a system, modern CMOS sensors are about 50% more sensitive to the green and red areas of the spectrum than to the blue. For example, for this CMOS sensor from Cypress, we can see on page 3 that the relative sensitivities are about red (75%), green (80%), and blue (50%) when you index the curves at the right wavelengths for each color. This lack of sensitivity, combined with a fixed level of sensor and sampling noise for all pixels across the sensor, puts blue at a significant signal-to-noise disadvantage compared to the other two colors.
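
Here's a back-of-the-envelope signal-to-noise comparison using those rough 75%/80%/50% figures and an assumed fixed noise floor (all the numbers are illustrative, not from any datasheet):

```python
import math

photons = 1000.0  # photons reaching each photosite (assumed)
floor = 10.0      # fixed sensor/sampling noise, in electrons (assumed)

for ch, qe in {"R": 0.75, "G": 0.80, "B": 0.50}.items():
    signal = photons * qe                 # electrons collected
    noise = math.sqrt(signal + floor**2)  # photon shot noise + fixed floor
    print(f"{ch}: SNR = {signal / noise:.1f}")

# R: SNR = 25.7
# G: SNR = 26.7
# B: SNR = 20.4
```

The lower collection efficiency hurts blue twice: there is less signal, and the fixed noise floor is a larger fraction of what's left.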

Netting this out, color CMOS sensors do best at reproducing green, followed by red, and finally blue, which is the worst of the three from an overall noise perspective.

Looking toward the future, note that these limitations in the blue channel are really mostly a matter of cost/performance optimization. That is, there is nothing inherent to physics that requires blue performance to be worse, only that it would be MUCH MUCH more expensive, given current device constructions, to improve the blue channel by a noticeable margin. Also, given that the human eye is not very sensitive on the blue/yellow color axis, current designs are already very well optimized. In fact, I'm sure most camera makers would prefer total cost to drop before paying the same or more just to improve blue channel noise performance.

*Bayer chose to set up the matrix this way because the human visual system gets the majority of its luminance signal (i.e., brightness information) from the green part of the color spectrum. That is, the eye's daytime (photopic) sensitivity peaks at green wavelengths, making the green part of the spectrum the most important visually.*

Tall Jeff
  • 4
    Yeah, more on the background: manufacturers weight their chips this way because they're approximating the colour sensitivity distribution of the human eye: our eyes are about 50% as sensitive to red as to green, and about 20% as sensitive to blue. That's why colour-to-greyscale conversions are weighted the way they are, typically in the realm of (0.2989r + 0.5870g + 0.1140b). – Jon Purdy Apr 10 '11 at 19:53
  • Presumably Foveon sensors do not exhibit this behaviour. – Marcin Apr 14 '11 at 09:37
  • @Marcin: why not? – mattdm Apr 19 '11 at 12:01
  • 1
    @Tall Jeff: I'm a bit concerned that this answer, while highly rated, is in direct contradiction to the other two. That is, you say that there's nothing inherent to physics which makes blue performance worse, whereas the others say it basically comes down to that. Which is right? – mattdm Apr 19 '11 at 12:36
  • 1
    @Mattdm: Because Foveon sensors don't use mosaicing, and have equal amounts of sites for all three channels. – Marcin Apr 19 '11 at 15:07
  • @mattdm - I don't think the other answers are in conflict. Certainly @Marcin hit on part of why. SUMMARIZING: @coneslayer's answer really just talks to why most scenes have less natural blue light at the source. This contributes to blue being typically noisier simply because there is less signal to start with, but a sensor optimized exclusively for blue would have no trouble getting a clean blue image. The @ShutterBug answer is more about the human eye's inability to detect blue very well, and why systems are optimized to take advantage of that and are optimized for viewing as such, too. – Tall Jeff Apr 19 '11 at 23:44
  • Also, CCD suffers from blue channel noise more than CMOS. In fact, I don't think this is a noticeable problem for CMOS at all. – fahad.hasan Apr 20 '11 at 11:18
  • @mattdm, @Marcin: The explanation about Foveon is partly true, however that is not the sole reason. Foveon sensels put the blue photodiodes on the top layer, where they directly receive incoming light. Normal bayer sensors tend to have circuitry that interferes with the photodiodes of each photosite, and that circuitry absorbs some light, particularly blue. Back-illuminated bayer designs suffer much less from this problem. – jrista Apr 20 '11 at 23:44
  • @Tall Jeff: so, why would it cost more to make sensor optimized for blue? Not necessarily specially optimized, but made such that blue isn't noisier than red? – mattdm Apr 23 '11 at 01:45
  • @ShutterBug: it is empirically true that the blue channel is noisier when looking at output from my CMOS-based Pentax K-7. – mattdm Apr 23 '11 at 20:27
15

In addition to the sensor response discussed by Tall Jeff, most scene illumination (sunlight, incandescent) is deficient in blue light relative to green and red. Fire up this Java blackbody simulator and see that blue is lower than green or red for color temperatures of interest (~5500 K daylight, ~3000 K incandescent).
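
If you'd rather not run the applet, a quick Planck's-law calculation tells the same story (the three wavelengths are just representative picks for each channel):

```python
import math

h, c, k = 6.626e-34, 2.998e8, 1.381e-23  # Planck, speed of light, Boltzmann (SI)

def planck(wl, T):
    """Blackbody spectral radiance at wavelength wl (m) and temperature T (K)."""
    return (2 * h * c**2 / wl**5) / math.expm1(h * c / (wl * k * T))

bands = {"blue": 450e-9, "green": 550e-9, "red": 650e-9}
for T in (5500, 3000):
    g = planck(bands["green"], T)
    print(T, {ch: round(planck(wl, T) / g, 2) for ch, wl in bands.items()})

# 5500 {'blue': 0.94, 'green': 1.0, 'red': 0.91}
# 3000 {'blue': 0.39, 'green': 1.0, 'red': 1.66}
```

For an ideal blackbody the blue deficit is mild at daylight temperatures, but under incandescent light the blue band receives well under half the energy of green.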

There's another small factor that compounds the problem. CCD and CMOS arrays are photon-counting detectors. Most plots, including those in the blackbody simulator above, show spectral energy density, not photon counts. Blue photons are more energetic than red photons, by the inverse ratio of their wavelengths, so for the same energy value on the plots, you would get about 25% more red photons than blue photons. And that's the starting point for the sensitivity effects Tall Jeff describes.
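
That correction is a one-liner to check (the wavelength picks are again just representative):

```python
h, c = 6.626e-34, 2.998e8   # Planck constant, speed of light (SI)
red, blue = 600e-9, 470e-9  # representative wavelengths (assumed)

# Each photon carries E = h*c/wavelength, so at equal spectral energy
# the photon count scales with wavelength:
print(f"energy per photon: blue {h * c / blue:.3g} J, red {h * c / red:.3g} J")
print(f"red/blue photon count ratio at equal energy: {red / blue:.2f}")  # ~1.28
```

The exact figure depends on which wavelengths you take as "red" and "blue," but it lands in the 25–45% range.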


Regarding CCDs and backside-illuminated sensors, frontside-illuminated CCDs do suffer from the same diminished blue sensitivity, as much of the blue light is absorbed while passing through the non-sensitive gate structure of the chip. Backside-illuminated sensors will see an improved blue response. See this typical spectral response curve (for various types of research-grade CCDs).

coneslayer
  • 1
    Not to mention that a lot of the blue gets scattered by the atmosphere, especially in the best hours to photograph (i.e. sunrise and sunset). – Agos Apr 10 '11 at 22:34
3

Because human eyes/brains are not as sensitive to changes in blue light as they are to changes in green or red light. Modern camera sensors act somewhat like human eyes, and are therefore less sensitive to blue than to green or red. Since the standard for displaying neutral on color monitors is to have equal amounts of blue, green, and red, and since the sensors are less sensitive to blue than to red and green, the blue channel has to be amplified. Amplifying the blue channel signal also amplifies blue channel noise.
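
A toy simulation of that amplification step (the channel sensitivities and noise level are made-up numbers, not measurements):

```python
import numpy as np

rng = np.random.default_rng(0)
scene = 100.0                                    # true scene level (arbitrary)
sensitivity = {"R": 0.75, "G": 0.80, "B": 0.50}  # assumed channel response

for ch, s in sensitivity.items():
    raw = s * scene + rng.normal(0.0, 5.0, 100_000)  # capture + fixed noise
    balanced = raw / s  # the gain that equalizes the channels for display
    print(f"{ch}: mean = {balanced.mean():.1f}, noise std = {balanced.std():.2f}")

# R: noise std ~ 6.67
# G: noise std ~ 6.25
# B: noise std ~ 10.00  <- 2x gain on blue doubles its noise, too
```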

Camera noise reduction is only applied if you're shooting JPEG, but since a lot of people shoot RAW, the blue channel is always somewhat noisy. I've searched for a remedy to this problem. One suggestion is to convert the image to Lab color, smooth/blur only the luminance channel, and then convert back to RGB. You can try it.
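
Here's a sketch of that Lab round-trip with scikit-image (the filename and blur radius are placeholders; a common variant blurs the a/b chroma channels instead, which tends to preserve more edge detail):

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import color, io

img = io.imread("noisy.jpg") / 255.0  # hypothetical input image
lab = color.rgb2lab(img)

# Blur only the luminance channel, as suggested above; use
# lab[..., 1] and lab[..., 2] instead to target chroma noise.
lab[..., 0] = gaussian_filter(lab[..., 0], sigma=1.5)

out = np.clip(color.lab2rgb(lab), 0.0, 1.0)
io.imsave("smoothed.jpg", (out * 255).astype(np.uint8))
```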

fahad.hasan
  • So, you're saying that modern camera sensors are less sensitive to blue intentionally, because that models the human eye better? – mattdm Apr 18 '11 at 12:38
  • It's the nature of blue light that makes sensors, as well as human eyes, less sensitive. In order to sense blue light properly you need to amplify it, which amplifies noise as well. – fahad.hasan Apr 19 '11 at 03:51
  • Why? What is it about blue light? And if we're less sensitive to it, why would you need to amplify it more? (As opposed to sensors being less sensitive, which almost-obviously requires more amplification.) – mattdm Apr 19 '11 at 11:52
  • The sensor is actually designed for peak sensitivity near the middle of the spectrum. By comparison, the spectral sensitivity is typically down by a factor of 2x at the blue end of the spectrum. The gain is turned up on the blue channel to compensate for the lack of short-wavelength sensitivity, which means that thermal noise in that channel is also amplified along with the signal. The same is true, but to a much less noticeable extent, in the red and green channels. – fahad.hasan Apr 19 '11 at 12:09
  • That makes sense to me. (Can you edit your answer above to include it? I think the part about "human eyes and therefore" is somewhat confusing.) – mattdm Apr 19 '11 at 12:38
  • 1
    Also, in @Tall Jeff's answer, currently voted to +20, he says "there is nothing inherent to physics that requires blue performance to be worse", which seems in direct contradiction to this, leaving me a bit confused. Can you help straighten me out? Thanks. – mattdm Apr 19 '11 at 12:40
  • The thing is, we have a single sensor to sense light of all different wavelengths. Since human eyes are more sensitive to green than to blue, sensors are designed the same way, to reproduce the image we see with our eyes. Now, it's natural that if a sensor is designed to sense the peak wavelength from green, it will have trouble sensing blue, as blue light lacks short wavelength. So sensor designers boost the blue, which also boosts noise. It's not actually a problem with blue light; it's how sensors are designed. I'll edit my answer once your confusion is clear. – fahad.hasan Apr 20 '11 at 04:11
  • Wait, hold on; isn't blue light shorter wavelengths than green or red? And speaking of red, by this explanation, why isn't the red channel as noisy as the blue? – mattdm Apr 20 '11 at 11:35
  • Now I'm a little confused too! I have read through a few light-related articles again, and it looks like I didn't understand the previous articles correctly; my apologies. Blue: 475 nm, Green: 510 nm, and Red: 650 nm. So, to put it simply: since green lies in the middle of the spectrum, sensors are designed to sense it properly. Red light has a longer wavelength and they can downscale it without visible noise, but for blue, which has the shortest wavelength, they need to upscale it, which causes the most noise. – fahad.hasan Apr 20 '11 at 12:10
  • Hmmmm; that last may explain why the red channel is most susceptible to clipping — http://photo.stackexchange.com/questions/10735/why-do-bright-red-flowers-end-up-without-details – mattdm Apr 20 '11 at 12:31
  • @ShutterBug — I think this is probably closest to a real, meaningful answer. Basically, current sensor tech can only be responsive to a limited range of light, and since green/yellow is a) in the middle and b) so important to human vision, sensor response is centered there, at the expense of both longer and shorter wavelengths. Does that sound right? I don't understand exactly what you're saying about upscaling and downscaling wavelengths, though. That sounds like something totally different. Do you have references to some of those articles? Thanks! – mattdm Apr 23 '11 at 01:41
  • While it's not a down-to-the-deep-science "why" answer, Goethe's observations on color weight might be relevant here. http://books.google.com/books?id=ofvRhNBgoCoC&pg=PA59&lpg=PA59#v=onepage&q&f=false – mattdm Apr 23 '11 at 18:55
  • @mattdm: Downscale means converting light with a longer wavelength to make it sensible for sensors designed to sense shorter wavelengths. The opposite goes for upscaling. Maybe someone who had better grades (not me!) in high-school physics can explain these terms better! – fahad.hasan Apr 24 '11 at 03:57
0

We have done an analysis of the blue, green, and red channels of a DP3 Merrill in digital (RAW) mode. I just purchased this camera in June 2018. The blue channel exhibits a level-dependent error in the A/D converter that is not present in the red and green channels, which function as expected. It appears that there may be an error either in the wiring of the blue channel A/D or in the code that translates the A/D voltage into the blue channel digital signal. It is NOT a sensitivity issue. It could be a saturation issue, i.e. physical voltages exceeding the A/D range at very low voltages, i.e. too much gain in that channel.

The camera was set at ISO 100 for acquiring data, and data were acquired over a range of shutter speeds and signal levels across a frame. The blue channel measurements were most nearly correct at the LOWEST signal levels; the higher the signal, the greater the error. It is a gain/digitization problem in the algorithm producing the X3F files, or perhaps a byte-ordering problem. We are looking at the X3F files directly to see if the error is already present there, but I expect it is, since both the TIFF and JPEG files produced by the converter have the same issue.

It is a question whether the manufacturer will be interested in correcting this problem. The Foveon chip is a good idea that needs to be engineered properly.
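
For anyone who wants to reproduce this kind of test, here is a sketch of the analysis (the filenames and shutter speeds are placeholders): shoot a flat, evenly lit target across an exposure bracket, convert, and check that each channel's mean scales linearly with exposure time.

```python
import numpy as np
from skimage import io

exposures = np.array([1/500, 1/250, 1/125, 1/60, 1/30])   # seconds (placeholders)
files = [f"flat_{i}.tiff" for i in range(len(exposures))]  # converted frames

for idx, name in enumerate("RGB"):
    means = np.array([io.imread(f)[..., idx].mean() for f in files])
    # A well-behaved channel gives a constant mean/exposure ratio; a
    # level-dependent error shows up as drift in the normalized ratios.
    print(name, np.round(means / exposures / (means[0] / exposures[0]), 3))
```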

  • This is a follow-up to the above answer. By converting the X3F file directly, avoiding the Sigma conversion utility, we find that the data is correct in ALL channels. The problem is in the blue channel conversion to TIFF/JPEG. We are looking to see what the error is, but it is probably a byte swap in that branch of the conversion. Tests have been made in several ways, and the camera output is what one should expect, given the sensitivity and mean absorption path for RGB photons in the camera. – cmitylliam Jun 18 '18 at 05:01
  • Hi, what tool are you using? https://github.com/Kalpanika/x3f/releases ? – biziclop Jun 28 '18 at 07:32