
I understand what colour balance is and how it is meant to be used. However I don't fully understand why we see what we see if an incorrect White Balance is used.

If for example we take a photo out in bright daylight, but have the white balance set to 3200K, that photo will appear "cooler" or to have more blue.

What is the camera trying to do that results in the blue colour? Why not any other colour?

The fact that there is more blue tells me that the camera is either trying to reduce the other colours or trying to increase the level of blue, but I can't reason out why it would do either.

I have looked at this question but either I didn't understand it, or it didn't answer quite the same question I have: What *exactly* is white balance?

James

1 Answer


White balance is applied while processing raw data. The purpose of white balance is to reach R = G = B for neutral (grey, non-colored, achromatic) areas of the image. The whole problem arises from the fact that the color channels of the sensor (for a typical Bayer sensor, those are R, G1, B, G2) have different "sensitivities", and the responses also depend on the spectrum of the light. Typically, for daylight the green channels are about half a stop to one stop more sensitive than the red and blue channels; for incandescent light the responses in the green and red channels are close to each other, while the response in the blue channel lags behind.

So, to equalize the responses, white balance coefficients are applied by multiplying the linear raw data in the respective channels. To put some numbers to it, here are the white balance coefficients for the Olympus E-M5 camera, in R, G1, B, G2 order:

// Olympus E-M5, CameraGroup = 6
{"Olympus", "E-M5", "Tungsten", {1.296875f, 1.0f, 3.265625f, 1.0f}},
{"Olympus", "E-M5", "3300K CCT", {1.546875f, 1.0f, 2.578125f, 1.0f}},
{"Olympus", "E-M5", "3600K CCT", {1.640625f, 1.0f, 2.367188f, 1.0f}},
{"Olympus", "E-M5", "3900K CCT", {1.734375f, 1.0f, 2.203125f, 1.0f}},
{"Olympus", "E-M5", "FL-W", {2.000000f, 1.0f, 2.601562f, 1.0f}},
{"Olympus", "E-M5", "4300K CCT", {1.851562f, 1.0f, 2.125000f, 1.0f}},
{"Olympus", "E-M5", "4500K CCT", {1.921875f, 1.0f, 2.148438f, 1.0f}},
{"Olympus", "E-M5", "4800K CCT", {1.976562f, 1.0f, 1.945312f, 1.0f}},
{"Olympus", "E-M5", "Daylight", {2.078125f, 1.0f, 1.820312f, 1.0f}},
{"Olympus", "E-M5", "Cloudy", {2.281250f, 1.0f, 1.640625f, 1.0f}},
{"Olympus", "E-M5", "6600K CCT", {2.304688f, 1.0f, 1.734375f, 1.0f}},
{"Olympus", "E-M5", "Shade", {2.476562f, 1.0f, 1.437500f, 1.0f}},
{"Olympus", "E-M5", "Flash", {2.351562f, 1.0f, 1.617188f, 1.0f}},

To convert to photographic stops / EV, calculate log2 of the numbers. From the table above you can see that if the image is taken in daylight, the blue channel needs to be multiplied by about 1.8. If the white balance is instead set to Tungsten, the blue channel is multiplied by a much larger factor, about 3.3, which is why the image appears bluish.
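
Below is a minimal sketch of that arithmetic in C (not taken from any camera's or raw converter's code). It applies the E-M5 Daylight and Tungsten coefficients from the table above to made-up raw responses for a neutral grey patch, and shows the extra blue gain in stops:

/* Sketch only: made-up raw responses, coefficients from the table above. */
#include <stdio.h>
#include <math.h>

int main(void) {
    /* Hypothetical linear raw responses for a neutral grey patch lit by
     * daylight: green strongest, red and blue lagging (example values only). */
    double raw_r = 0.48, raw_g = 1.00, raw_b = 0.55;

    /* Coefficients from the table above, in R, G, B order. */
    double daylight[3] = {2.078125, 1.0, 1.820312};
    double tungsten[3] = {1.296875, 1.0, 3.265625};

    /* Correct preset: the channels come out roughly equal, so the patch
     * renders as grey. */
    printf("Daylight WB: R=%.2f G=%.2f B=%.2f\n",
           raw_r * daylight[0], raw_g * daylight[1], raw_b * daylight[2]);

    /* Wrong preset for this light: blue gets ~3.3x instead of ~1.8x and red
     * gets less than it needs, so the patch comes out blue. */
    printf("Tungsten WB: R=%.2f G=%.2f B=%.2f\n",
           raw_r * tungsten[0], raw_g * tungsten[1], raw_b * tungsten[2]);

    /* The extra blue gain expressed in photographic stops. */
    printf("Extra blue gain: %.2f stops\n", log2(tungsten[2] / daylight[2]));
    return 0;
}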

Iliah Borg
  • Thanks so much, this has been bothering me all day but this made it click. If I understand you correctly this would mean the sensor doesn't have a base or native white balance at all? – James May 03 '15 at 16:28
  • @James : No, sensors do not have a native white balance under any normal light sources. Daylight or flash with a magenta filter like CC40m results in something close to a native white balance, as does incandescent light with a strong blue filter. – Iliah Borg May 03 '15 at 16:31
  • Most sensors do not record any color information at all. They record monochromatic luminance values for each pixel. The pixels are filtered for Red, Green, or Blue light, but there is only one luminance value for each pixel. The concept is similar to using color filters when shooting with B&W film. A red filter will cause the red light passing through the filter to have a higher tonal value (brighter gray) on the film than equally bright blue or green objects. – Michael C May 03 '15 at 17:27
  • To get color from that information, the demosaicing algorithm used by your camera (JPEG) or computer's photo conversion application (raw) compares the relative brightness of adjacent pixels filtered with the different colors and interpolates R, G, & B values for each pixel. – Michael C May 03 '15 at 17:29
  • @Michael Clark - they do record information about colour. The representation of colour as intensities is no different from colour TIFF. TIFF with three channels, each of which is "monochromatic", or Bayer raw with 4 channels, each of which is also "monochromatic", differ only spatially. – Iliah Borg May 03 '15 at 17:46
  • @Michael Clark : to get colour, a colour transform needs to be applied. Demosaicking only restores missing intensity values. Binning is a sort of demosaicking resulting in full triplets, and only colour and tone transforms are necessary to assign proper interpretation. – Iliah Borg May 03 '15 at 17:48
  • It all depends on how you define color information. Most people assume each pixel records a red, a green, and a blue value. That is not the case (other than with specialized non-Bayer sensors). Each pixel records one luminance value, thus it is monochromatic. Even though some green and blue light will, for instance, make it to the bottom of a red filtered pixel well and vice versa. All each pixel senses is the total amount of light that reaches the bottom of the well, regardless of what color it is. There are no missing intensity values. Only interpolated color with a Bayer sensor. – Michael C May 03 '15 at 18:11
  • @Michael Clark : well, when we are looking at a CMYK raster print, do we consider it to be a colour image? And on the same note, colour display? – Iliah Borg May 03 '15 at 18:54
  • @Michael Clark : Each channel is monochromatic, be it common TIFF or common raw. When we perform demosaicking, we may want to restore spatial properties, filling in missing intensities for colour channels. That is what we interpolate. It is not the same as interpolation of colour. Many demosaicking algorithms are colour-agnostic, and in testing they perform at least as well as those that involve colour transforms to Lab, YCC, and similar. – Iliah Borg May 03 '15 at 19:02
  • @IliahBorg The data in a raw file is not the same as a CMYK or TIFF file. All of the values in a raw file are 0-255 (for an 8-bit depth). There is no 255B, 255G, 255R. There is only a number. With a TIFF or CMYK or any other type of raster image there are specific values for individual colors. Not so with raw data. The difference in intensity between adjoining pixels filtered for different colors is used to derive color, not the bare luminance values themselves. – Michael C May 03 '15 at 22:11
  • @Michael Clark : Dear Michael, the data in a raw file is exactly that - intensity of red, green, blue, and secondary green. It can be normalized as 0..255R, etc. If you were interpolating from a CMYK or RGB raster, it would work the same way, using intensity differences, because the implicit hypothesis underneath is that intensities change slowly. – Iliah Borg May 03 '15 at 23:42
  • Have you ever taken photos using color filters and B&W film? The filters don't restrict all light other than the color of the filter from passing, they only reduce the intensity of that other light. The same is true of Bayer filters. If you have a pure red light source just bright enough to saturate the red channel the green pixels will still detect some (but not near saturation) of that light and even the blue filtered pixels will detect a much smaller amount of it. The interpolation of color will take into account the sensitivities of the other pixels to red light and compensate for it. – Michael C May 04 '15 at 01:34
  • @Michael Clark : Yes. That is true with the colour transparency film as well (and btw it is a stochastic raster image too, with spatial voids). The matter is solved by assigning proper colour space. Colour is not interpolated, intensities are spatially interpolated (or not, in case of binning). At the beginning those who were coding raw converters tried to present the products to the public as something "black magic". We are well past that, in fact in 2003 it was already obvious that raw data constitutes a normal image. – Iliah Borg May 04 '15 at 01:55
  • http://en.wikipedia.org/wiki/File:Alleia_Hamerops_composite.jpg – Michael C May 04 '15 at 03:37
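
To illustrate the spatial interpolation discussed in the comments above, here is a minimal sketch of bilinear demosaicking over a tiny hand-made RGGB mosaic; the values are invented and this is not the algorithm any particular camera or raw converter uses:

/* Sketch only: invented mosaic values, simple bilinear averaging. */
#include <stdio.h>

#define W 4
#define H 4

int main(void) {
    /* RGGB mosaic: each cell holds one intensity; its colour is implied by
     * position (even row/even col = R, odd row/odd col = B, the rest = G). */
    double mosaic[H][W] = {
        {0.40, 0.80, 0.42, 0.82},
        {0.78, 0.30, 0.80, 0.31},
        {0.41, 0.81, 0.43, 0.79},
        {0.79, 0.29, 0.77, 0.30},
    };

    /* Estimate the missing green intensity at the red site (2,2) by
     * averaging its four green neighbours. */
    int y = 2, x = 2;
    double g = (mosaic[y - 1][x] + mosaic[y + 1][x] +
                mosaic[y][x - 1] + mosaic[y][x + 1]) / 4.0;

    printf("Recorded R at (%d,%d): %.2f, interpolated G: %.2f\n",
           y, x, mosaic[y][x], g);
    return 0;
}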