5

I've been shooting quite a few gigs at clubs/concerts and repeatedly run into issues with scenes lit by narrow-bandwidth lights (for example lasers).

These lights cause a single channel to oversaturate, yielding weird artifacts. From theory, I'd expect a very strong light to bleed into the other channels, leveling off at white. Here's a simulation of what I expect, rendered in Filmic Blender: every rectangle emits RGB(0,0,1), with luminosity increasing from left to right.

Instead it seems like the camera just clips off the blue at its maximum value. If anything, it switches instantly from blue to white. Here's one example:

[example photo: the blue laser light clips abruptly instead of rolling off to white]

and there are some more on Imgur.

In order to avoid these problems, I usually coordinate with the organizers/lighting to make sure there is always tungsten lighting available on stage. This requirement limits my freedom a lot and I'd like to avoid it.

Is there some image processing, for example mapping RGB(0,0,1) to white, that'd help? Is this just a hardware limitation of all the cameras I've used until now (various Canon DSLRs and the Sony a7 lineup)? These images were recorded in the Adobe RGB color space; is that a limitation?

Philip Kendall
DeinFreund
  • Are you aware that a laser (even reflected off some guy's cell phone) can permanently destroy your camera's chip within a millisecond? Google it, many photographers have ruined $5000+ cameras that way (I know two of them) – Aganju Oct 05 '17 at 01:25
  • @Aganju yes, the lasers in the sample pictures were actually controlled by a friend of mine. Of course, there is still a risk of reflection which is why I use my "stunt camera" to record these. – DeinFreund Oct 05 '17 at 09:35
  • One thing to be aware of: some lasers used for light shows (HOPEFULLY never ever for crowd scanning, or you certainly shouldn't be there as a photographer!) are repurposed industrial pulsed/"q-switched" lasers. Google "vag burner". These exhibit extreme peak brightnesses that might be beyond even some studio strobes. Might confuse the hell out of some hardware. – rackandboneman Dec 10 '18 at 22:08
  • Addendum: Rumour has it that people who get hit in the eye with one hear a sound in their skull similar to the one you get when firing a strong speedlight straight into black paper held against it.... – rackandboneman Dec 10 '18 at 22:10

3 Answers

3

This is a hardware limitation. The way they are engineered, CMOS sensors expect all three colors to behave as they do in natural light: no single color is so extraordinarily bright that it clips because of its brightness alone.

In order to avoid the clipping in the blue channel, you can:

  • use a filter in front of your lens to lower the brightness of the blue color (a rough sketch of the idea follows this list)
  • ask the lighting professional to aim for a more even light.
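
As a rough illustration of the filter suggestion above, here is a toy numpy sketch (the numbers are invented for illustration, not measured from any real camera) of how a narrow-band blue source pins only the blue channel while red and green stay near zero, and how attenuating the blue before it reaches the sensor restores gradation:

    # Toy model of per-channel clipping on a sensor; values are illustrative.
    import numpy as np

    # Scene: a blue laser wash whose intensity ramps from 0.1x to 20x of the
    # sensor's full-well capacity, with almost no red or green content.
    intensity = np.linspace(0.1, 20.0, 8)
    scene = np.stack([0.01 * intensity,   # red
                      0.02 * intensity,   # green
                      1.00 * intensity],  # blue
                     axis=-1)

    def expose(rgb, nd_blue=1.0):
        """Clip each channel at full well; nd_blue < 1 models a filter that
        attenuates blue before it reaches the sensor."""
        attenuated = rgb * np.array([1.0, 1.0, nd_blue])
        return np.clip(attenuated, 0.0, 1.0)

    print(expose(scene))                # blue pins at 1.0 almost immediately
    print(expose(scene, nd_blue=0.05))  # blue keeps its gradation much longer
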
oa-
  • 176
  • 1
  • 1
  • 6
  • This is not a hardware limitation with most modern cameras. RGB histograms are readily available on the back of most cameras as soon as a photo is taken. This is a photographer skill and experience limitation. – Michael C Oct 04 '17 at 18:32
  • @MichaelClark can you be more specific? These are shots from my first ever gig to show the most extreme case, I've been able to slightly lessen the impact by optimizing exposure. This still doesn't fix the underlying problem. – DeinFreund Oct 04 '17 at 19:40
  • As written in the question, I usually speak with the lighting director beforehand. This doesn't work for laser shows. The filter idea sounds interesting, I'll be looking forward to try that out. The a7 lineup seems to mostly struggle in the blues, so it shouldn't be too bad to have a little more of the other two channels. – DeinFreund Oct 04 '17 at 19:45
  • 1
    @MichaelClark - I'm not sure if you and oa are talking about the same thing. It sounds to me like oa is referring to the fact that the bayer filter design means that modern digital cameras can't deal with lighting being extremely strongly colored. For example, underwater shots can't simply be corrected with white balancing because the amount of blue light is too high relative to the amount of red and green. You would need to filter out the blue light to bring things back within the dynamic range of the sensor. – AJ Henderson Oct 04 '17 at 19:46
  • 1
    The blue won't clip to white unless there is either a) enough R & G for all three channels to clip or b) enough blue light to cause blooming in R & G filtered pixels adjacent to the blue filtered pixels. The second condition seems to be the case here. The solution in either case is to reduce exposure of the blue light, by reducing total exposure or by adding more filtering (than provided by the Bayer mask) to the blue light. Unfortunately, in the case of very narrow band light sources, filtering out the blue will filter out pretty much all of the light in the background of the sample image. – Michael C Oct 04 '17 at 20:44
3

This can frequently be the byproduct of what could essentially be called camera metamerism: a particular mix of certain wavelengths that pins one of the CMOS bins in an irregular way. This is easier to understand once one considers the likelihood that a particular laser just happens to perfectly match the colour filters on the CMOS photosites; it would be virtually impossible for such a thing to happen. The ungraceful white is due to the camera software correctly considering all wells peak-saturated when a single value maximizes. If the decoding software didn't do this, you would see horrific chromaticity skews for any colour that has been pinned to full, while the other wells were rendered "as is".
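
As a toy illustration of that mismatch, here is a sketch using made-up Gaussian sensitivity curves (not real CFA data) to show how a single laser line excites the photosites in ratios that no broad-spectrum colour would produce:

    # Made-up CFA sensitivities and two light sources of equal total power;
    # the laser line dumps nearly all of its energy into a single bin.
    import numpy as np

    wavelengths = np.arange(380, 701)  # nm

    def gaussian(center, width):
        return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

    # Pretend colour-filter sensitivities peaking in red, green and blue regions.
    sens = {"R": gaussian(600, 40), "G": gaussian(540, 40), "B": gaussian(460, 35)}

    def raw_response(spectrum):
        return {ch: float(np.sum(s * spectrum)) for ch, s in sens.items()}

    daylight = np.ones_like(wavelengths, dtype=float)   # flat, broad spectrum
    laser = np.zeros_like(wavelengths, dtype=float)
    laser[wavelengths == 450] = wavelengths.size        # all power at 450 nm

    print(raw_response(daylight))  # all three channels respond comparably
    print(raw_response(laser))     # the blue bin dominates overwhelmingly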

The problem is exacerbated when shooting in wider gamut spaces due to the (typically) absolute or relative colorimetric transform to a smaller gamut space.

If you are wondering how to negotiate it, you could use the approach ACES does for the same issue with some cameras, and apply a carefully constructed matrix to desaturate the troublesome colours. This is still an issue even for ACES, and further solutions are being experimented with.
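
A minimal sketch of the shape of that approach (the matrix below is invented for illustration; the real ACES fix uses carefully derived coefficients, not this uniform leak):

    # Bleed a little of each channel into its neighbours so a single pinned
    # channel no longer dominates. Rows of the matrix sum to 1.
    import numpy as np

    leak = 0.08  # arbitrary illustrative value
    M = np.array([[1 - 2 * leak, leak,         leak],
                  [leak,         1 - 2 * leak, leak],
                  [leak,         leak,         1 - 2 * leak]])

    def desaturate(rgb):
        """Apply the 3x3 matrix to an (..., 3) float image in linear light."""
        return np.einsum('...c,dc->...d', rgb, M)

    pinned_blue = np.array([0.02, 0.05, 1.0])  # laser-lit pixel, blue pinned
    print(desaturate(pinned_blue))             # blue comes down, R and G come up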

Another option might be a custom 3D LUT that forces a more graceful desaturation while attempting to hold the chromaticity axis between the original colour and the white point. Most digital cameras are exceptionally trash at this, and instead end up doing poor to no colorimetric desaturation.

Regarding the white hard clip, it is possible you could try dcraw to render the debayered image without the hard cut, but it may result in equally alien-looking imagery, depending on the context.
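
If you want to experiment with that route, here is a small sketch that drives dcraw from Python; the file name is hypothetical, and you should check dcraw's own usage text for the exact meaning of the options on your build:

    # Develop a RAW file with dcraw's highlight handling instead of the hard
    # clip: -H 1 leaves highlights unclipped, -H 2 blends them toward white,
    # and -4 -T writes a linear 16-bit TIFF next to the input file.
    import subprocess

    def develop_without_hard_clip(raw_path, highlight_mode=1):
        """Assumes dcraw is installed and on the PATH."""
        subprocess.run(
            ["dcraw", "-H", str(highlight_mode), "-4", "-T", raw_path],
            check=True,
        )

    develop_without_hard_clip("concert_0001.ARW")  # hypothetical file name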

troy_s
1

You could probably accomplish this with a custom RAW processor, but fundamentally this is just an artifact of the way the image processor elects to handle clipped values. Even if you made it "non-white", it is still going to be an odd artifact, because the color can't be properly defined: it is out of range of the sensor. The over-saturation of blue overexposes the photosites covered by blue filters while the camera is still trying to gather sufficient green and red information to form a full color image.

The image processor chooses to display these clipped values as white since it doesn't know how much blue to mix in (relative to the red and green it does have values for). The pixel could have been clipping just barely or by a lot, which would change the apparent color that should be displayed, so the color cannot be meaningfully defined with an unknown blue component.

You need to either limit the amount of blue light entering the lens (color filter) or increase the amount of light for the other colors relative to the problem color. This will allow you to get sufficient exposure of red and green without over-exposing on blue before you have enough color information. This is, of course, dependent on having enough light in each of the colors your camera sensor is sensitive to in order to get an exposure. If there is truly only blue light then filtering will just block the light, but if there is a small underlying amount of full spectrum light it can bring the blue down enough to allow proper exposure of all color channels.
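
If, despite that, you want to approximate the question's "map RGB(0,0,1) towards white" idea in post, here is a minimal cosmetic sketch (the thresholds are invented starting points; it only hides the clipping, it cannot recover information the sensor never captured):

    # Fade nearly clipped pixels toward white so the transition looks more
    # like the Filmic simulation in the question. Purely cosmetic.
    import numpy as np

    def fade_clipped_to_white(img, start=0.90, white=1.0):
        """img: float array (..., 3) in [0, 1], linear or near-linear.
        Pixels whose brightest channel exceeds `start` are blended toward
        white, reaching full white as that channel reaches `white`."""
        peak = img.max(axis=-1, keepdims=True)
        t = np.clip((peak - start) / (white - start), 0.0, 1.0)
        return img * (1.0 - t) + t  # linear blend toward (1, 1, 1)

    laser_pixel = np.array([0.02, 0.05, 0.99])
    print(fade_clipped_to_white(laser_pixel))  # pushed most of the way to white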

AJ Henderson
  • This matches my experience. Lower exposure allows the camera to deal with the blue better, but also adds even more noise and banding to the shadows. This trade-off is especially strong in videography which I do most of the time now. I'll make sure to try filters. – DeinFreund Oct 04 '17 at 20:01
  • Yeah, it's an even more common problem in underwater photography and videography. Btw, if you do video, check out Video Production as well. – AJ Henderson Oct 04 '17 at 20:07
  • 1
    If the only source of light is a narrow spectrum blue light, adding an orange filter will do the exact same thing as reducing exposure in the camera. If all of the light is the same color, filtering that color light out will reduce total exposure. You can only filter out blue to increase R & G if there is some R &G in the light source. – Michael C Oct 04 '17 at 20:31
  • The color is not out of range of the sensor. The intensity with which that color was recorded is out of range of the sensor. – Michael C Oct 04 '17 at 20:33
  • @MichaelClark actually the color IS out of range for the sensor. The dynamic range is exceeded, so you either lose red and green to crushing or let blue clip. The color is outside the range the sensor can capture if two channels are too different in intensity, which is precisely what causes heavily tinted lighting to be a problem if you are trying to capture color without the tint. – AJ Henderson Oct 04 '17 at 22:40
  • Also, from the sample image, there is some light other than the blue. It's just drastically overpowered. It probably is worth clarifying that further in my answer so it isn't left to deduction to figure that out. – AJ Henderson Oct 04 '17 at 22:43
  • If the source of the blue light in the background is a true laser, there is absolutely no green or red in that light at all. It is single wavelength light. Therefore the color of that light, when properly exposed is within the camera's range. The question is not asking about the fuller spectrum light in the foreground. It is asking about the blue light that has been clipped to white due to blooming caused by overexposure of a single wavelength of light. – Michael C Oct 04 '17 at 23:08
  • The same thing would have happened if there had been no light in the foreground. The most intense areas of blue would have been clipped to white due to blooming, not due to R & G light that is not present. – Michael C Oct 04 '17 at 23:09
  • Ok, in the context of just trying to deal with the laser I agree, but if he wants the image to look basically like it looks here, then it goes out of range if you want to resolve the overall image. It depends on his intent which isn't completely explained. – AJ Henderson Oct 04 '17 at 23:14