
I was shooting in a gallery that used what appears to be a multi-colored projector and had difficulty because the color was not even across the scene. The image shown below illustrates what I mean: you can see traces of green, blue, and yellow in the picture, even though the scene looked uniformly white/gray to the naked eye.

Why does this occur in a camera's output (DSLR or mirrorless) when shooting a scene lit by a projector? Is it the type of projector used? Is there a way to fix this, either in post-processing or with camera settings?

[Example image: white gallery wall and floor showing green, blue, and yellow color casts]

4 Answers


It's the difference between how your camera and your eye/brain see the scene.

Many projectors project each color channel in sequence, very rapidly. It's too fast for our eyes to pick up, because our brains average the sequence of color channels. But if the camera uses a shutter time shorter than one full color cycle of the projector, it only sees part of the sequence.

This is particularly true of most consumer cameras, which use either a mechanical shutter that starts the exposure on one side of the sensor and ends it on the other, or an electronic shutter that reads each line of the sensor in sequence from one side to the other (or from top to bottom). With a mechanical shutter at faster shutter times, the second curtain chases the first curtain across the sensor, so only a narrow slit is being exposed at any one time. As the slit between the two curtains transits the face of the sensor, the projector is flickering between each of the color channels it uses. Even with a shutter time of 1/8000 second, it still takes about 1/250 to 1/400 second for the slit to transit the entire sensor. An electronic shutter is similar, as each line of the sensor is read in sequence; when the last line has been read, the camera starts over at the first line.
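To make the interaction concrete, here is a minimal sketch in Python of the rolling-shutter effect. The row count, the ~1/300 s curtain transit time, and the projector color-cycle rate are all illustrative assumptions, not measurements of any particular camera or projector:

```python
# Minimal sketch: which projector color channel does each sensor row see
# while the shutter slit transits the sensor? All numbers are illustrative.

rows = 24                  # coarse stand-in for the sensor's row count
transit_time = 1 / 300     # assumed time for the slit to cross the sensor (s)
color_cycles_per_s = 360   # assumed full R->G->B cycles per second
channels = ["R", "G", "B"]

for row in range(rows):
    t = row / rows * transit_time            # moment this row is exposed
    phase = (t * color_cycles_per_s) % 1.0   # position within one R->G->B cycle
    channel = channels[int(phase * len(channels)) % len(channels)]
    print(f"row {row:2d}: exposed at t = {t * 1000:5.2f} ms -> sees {channel}")
```

Adjacent rows land in different parts of the color cycle, so the output shows runs of rows that each saw only one channel, which is exactly the banding described above.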

The solution is to lengthen the exposure time enough to cover at least one full cycle of the projector's color sequence. To prevent overexposure, you may need to reduce ISO, close down the aperture, or even use a neutral density filter.
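As a rough worked example (the starting and ending shutter speeds here are assumptions for illustration), the compensation needed when lengthening the exposure is just the base-2 log of the ratio of shutter times:

```python
import math

old_shutter = 1 / 500  # assumed original shutter time (s)
new_shutter = 1 / 30   # assumed long enough to span a full projector color cycle

stops = math.log2(new_shutter / old_shutter)
print(f"Exposure increases by {stops:.1f} stops")  # about 4.1 stops

# Compensate with some combination of lower ISO, a smaller aperture, or an ND
# filter: e.g. dropping from ISO 1600 to ISO 100 recovers exactly 4 stops.
```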

Michael C
  • If this was the cause, you would expect colour banding in the direction of the shutter movement. In the example image, the yellow colour cast can however only be seen in the upper right corner, which is surrounded by an L-shaped blue area. I am not sure what causes the colour cast here, but colour flickering and shutter movement are obviously not the cause. – jarnbjo Oct 22 '18 at 12:49
  • @jarnbjo The yellow cast is also in the shadows in the lower right corner as well. Having shot tens of thousands of frames under stadium lights flickering at 120Hz, I can tell you that the rolling shutter effect due to changing color as the slit between the curtains transits the sensor is not always uniform all the way across the frame. For whatever reason, the area without yellow in the center on the right could have been dark in the projector's image when yellow was being scanned. – Michael C Oct 22 '18 at 15:27
  • In which case you must have been subject to some other effect and believed that it has to do with light flickering and shutter movement. Interference between flickering lights and a moving shutter (or the readout of the sensor) will cause banding or stripes at some angle. It is simply not possible for such an effect to apply to rectangular areas of the image as here. – jarnbjo Oct 22 '18 at 15:37
  • @jarnbjo Please look at the image again. There is a yellow cast in the lower right portion of the frame as well; it's just darker in the shadows. The three major bands (blue on left, white in center, and yellow on right) are in the same orientation as the shutter curtain edges as they transit the camera being held in portrait orientation. If the projector does not project much yellow in the middle of the band on the right (because there's less yellow to be projected at that spot in the image it is projecting), then it won't show up. – Michael C Oct 22 '18 at 15:48
  • @jarnbjo Please see Bryan's test images in this article at The-Digital-Picture. The bands look fairly rectangular to me, which you say is impossible. The difference with a projector is that it is not projecting a uniform field of each color, but certain areas of the projection are brighter or dimmer for each color based on the scene content it is projecting. – Michael C Oct 22 '18 at 15:55
  • If, for example, an image only has blue in the upper half (sky) of a desert landscape with orange rocks and sand in the lower half of the frame, when a projector renders that scene there would be very little or no blue light energy in the lower half of the image when the projector is projecting the blue channel. – Michael C Oct 22 '18 at 15:58
  • Are we looking at the same pictures? Your textual description (blue on left, white in center, and yellow on right) does not match the example image from the question and I see no distinct rectangular patches of light on the images in the article you are linking to, only uniform bands or stripes going through the entire image. – jarnbjo Oct 22 '18 at 15:58
  • The bands in Bryan's test images are about as well defined rectangles as the bands in the example image above. The difference is that the flickering lights Bryan illustrates are flickering light sources due to variations in the voltage flowing through a gas filled tube that can't project an image. At each spot on the target, the projector projects differing amounts of energy for each color channel based on the contents of the scene it is projecting. If there is not much yellow in the scene in the middle of where we see the yellow band, there will be a gap in the band there. – Michael C Oct 22 '18 at 16:07
  • I can just repeat my question: Are we looking at the same pictures? Any alleged similarity does not suddenly appear just because you repeat your claim. – jarnbjo Oct 22 '18 at 16:11
  • I have no way of knowing what you are seeing. I am accurately describing what I am seeing. – Michael C Oct 22 '18 at 16:15
  • I have emphasized the hue of the posted image. Don't you see the blue L-shaped area I am talking about? After making the hue more visible, it is also obvious on the floor that the areas of colour cast do not match the interference pattern you would have seen if it was the result of flickering lights and shutter movement. https://pasteboard.co/HJEMyRY.jpg – jarnbjo Oct 22 '18 at 17:09
  • The floor is reflecting what is in the display above. If the projector is not aimed at the short wall beneath the display, why would we expect the wall to be affected by light that is not shining on it? Again, a projector differs from a flickering light source in that the same colors are not projected over the entire field. Each "pixel" has its own level of intensity for each color channel. If there is no yellow in the bottom half of the picture the projector is reproducing, then the yellow band will not extend all the way to the bottom of the part of the image illuminated by the projector. – Michael C Oct 23 '18 at 01:18
  • Your "blue" L (which should probably be white if the image were properly color balanced) could be the result of a light source other than the projector that is being overpowered by the projector in the areas where the greenish blue band on the right and the yellow band on the left are manifested. – Michael C Oct 23 '18 at 01:26
  • @jarnbjo Projectors and intelligent stage lights can produce infinite variations of time-multiplexed patterns that mess with any kind of fast or rolling shutter... – rackandboneman Aug 04 '19 at 01:12

If multiple projectors were used, the white balance may not have been consistent among all of them. They may have initially been set to match, but the color temperature can change as the lamps age. We don't normally perceive the differences because our brains compensate. The differences among light sources can also be exaggerated by increased saturation settings.

You can try decreasing saturation. Use layer masks to limit the adjustment to the affected areas.
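If you would rather script the fix than paint layer masks by hand, a rough sketch of the same idea with OpenCV and NumPy might look like this; the file names and the rectangle marking the affected area are placeholders you would replace with your own image and mask:

```python
import cv2
import numpy as np

img = cv2.imread("gallery_shot.jpg")              # placeholder file name
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV).astype(np.float32)

# Mask of the affected region -- here a hypothetical rectangle; in practice
# you would paint this by hand, just like a layer mask.
mask = np.zeros(img.shape[:2], dtype=np.float32)
mask[100:400, 600:900] = 1.0
mask = cv2.GaussianBlur(mask, (51, 51), 0)        # feather the edges

# Reduce saturation only where the mask is set.
strength = 0.6                                    # 0 = no change, 1 = fully gray
hsv[..., 1] *= 1.0 - strength * mask
out = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
cv2.imwrite("gallery_shot_fixed.jpg", out)
```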

xiota

My guess is the projectors are not projecting the image in one hit like a film projector, but are scanning or interlacing red/green/blue much the same as TVs do, and your camera is faster than your eye.

If you slow the shutter speed to 1/10 s or longer, it might "fix" the problem, as your shutter is open long enough to capture at least one full frame.
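As a quick sanity check of that number (assuming a projector refreshing at 60 Hz, which is a common but not universal rate), you can count how many full refresh cycles various shutter speeds span:

```python
refresh_hz = 60  # assumed projector refresh rate
for denom in (500, 125, 60, 30, 10):
    shutter = 1 / denom
    print(f"1/{denom} s covers {shutter * refresh_hz:.2f} refresh cycles")
# 1/10 s covers 6.00 full cycles, while 1/500 s covers only 0.12 of one,
# so the longer exposure averages the full colour sequence together.
```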

You can google how projectors and TVs work if you need a bit more explanation.

John U

Projected images are designed to be observed by human eyes. Projectors and screens apply additive color theory to match colors in the real world. When you photograph a real scene, that scene typically contains a full spectrum of light. A screen, by contrast, emits only a small set of primaries, mixed together to create what appears to be the pure wavelength of those real colors, but only to a human observer.

For example:

In real life, if you photograph a banana, its spectral reflection will be real yellow light at around 580 nm. If you photograph a banana on a screen, however, the screen will transmit a metameric combination of red (650 nm) and green (550 nm) light that produces what appears to be that same 580 nm color, but only to a human observer.

Here's a bit of basic color theory.

Spectral power distribution of light source(λ) * Spectral reflectance / transmittance of object(λ) = Color stimulus(λ)

Color stimulus(λ) * CIE 1931 color matching functions(λ)(c) = XYZ tristimulus values

The tristimulus values describe the eye's integrated response to a particular color stimulus, and the color matching functions represent the chromatic response of each cone type in the human eye. What's important to note here is that 580 nm light and a combination of 650 nm and 550 nm light will produce the same color to a human observer if the tristimulus values are identical.

Projectors are designed to produce metameric matches for a human observer. Digital camera sensors, on the other hand, have their own set of spectral sensitivities, determined by the quantum efficiency of each RGB-filtered photosite.

Color stimulus(λ) * Spectral sensitivity of camera sensor(λ)(c) = Camera responsivities(c)

Those camera responsivities are different from the spectral responsivities (tristimulus values) of a human observer. Thus the combination of RGB light that the projector uses to produce white to the eye does not produce white to the camera.
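Here is a small numeric sketch of that argument. The CIE 1931 color matching function values at the three wavelengths are approximate, and the camera sensitivities are invented purely for illustration; the point is only that a 650 nm + 550 nm mix tuned to match 580 nm for the standard observer produces very different channel ratios in a camera:

```python
import numpy as np

# Approximate CIE 1931 2-degree color matching function values (xbar, ybar, zbar)
cmf = {
    550: np.array([0.4334, 0.9950, 0.0087]),
    580: np.array([0.9163, 0.8700, 0.0017]),
    650: np.array([0.2835, 0.1070, 0.0000]),
}

# Hypothetical camera sensitivities (R, G, B) at the same wavelengths.
# These are invented for illustration, not any real sensor's data.
cam = {
    550: np.array([0.10, 0.90, 0.15]),
    580: np.array([0.45, 0.60, 0.02]),
    650: np.array([0.85, 0.05, 0.00]),
}

# Find the mix of 650 nm and 550 nm light that best reproduces the eye's
# XYZ response to monochromatic 580 nm light (least-squares fit).
A = np.column_stack([cmf[650], cmf[550]])
w, *_ = np.linalg.lstsq(A, cmf[580], rcond=None)

print("eye,    580 nm:", cmf[580])
print("eye,    mix:   ", np.round(A @ w, 4))
print("camera, 580 nm:", cam[580])
print("camera, mix:   ", np.round(w[0] * cam[650] + w[1] * cam[550], 4))
# The two eye responses nearly agree (a metameric match), but the camera's
# responses to the same two stimuli are very different.
```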

srb633