What is the area of the CIE 1931 diagram that is covered by contemporary digital camera sensors? I can find triangles for LCD panels being compared to sRGB, Adobe RGB, and ProPhoto. There are always specifications of panels covering, say, 98% of sRGB or 72% of NTSC, but I can't find sensor color coverage anywhere.
-
In jrista's answers to two questions he answers at least some of this: How much of a difference do different color spaces make?, and more of it in this: What are Color Profiles and where would I find information on using them properly?. In particular: "When using RAW, the full color range supported by the camera sensor will be available, which in modern digital cameras can greatly exceed the sRGB or Adobe RGB gamut." – dpollitt Feb 19 '15 at 02:19
-
Yes, that's what I guessed, but are there any graphical representations of that? New cameras have 14-bit converters, which means 16k levels per channel. When sensor voltages are translated into binary, what actual colors are represented by (16k, 0, 0) or (0, 16k, 0)? I guess these colors are not in the sRGB color space, but somewhere outside it. – Cornelius Feb 19 '15 at 14:53
-
You might ask the guys at RawDigger. I think you could use that (it reads the sensor data without interpretation) with photos of known calibration targets to find out. An IT8 target comes with a file of the exact values of each patch, measured with high-end equipment. You can plot the patches' values on a graph of 14-bit sensor values and extrapolate to the most extreme. – JDługosz Feb 19 '15 at 18:06
-
If you're interested in what actually ends up in a RAW file created by the camera, it will vary by the camera model. This is one of the reasons it can take some time for RAW formats from newer models to be supported in third-party RAW converters. If you want to do a high-quality conversion you usually have to create a color profile for the camera, unless the camera natively writes DNG, which includes the device color profile as part of the metadata. – ColleenV Feb 19 '15 at 21:11
-
@Cornelius "what actual colors are represented with (16k, 0, 0) or (0, 16k, 0)" – you need to normalize sensor output somehow; a familiar way would be dividing by the max readout and multiplying by 2^8 - 1. – Iliah Borg Feb 21 '15 at 17:24
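The normalization Iliah Borg describes can be sketched in a few lines of Python. This is only an illustration; the 14-bit full-scale value of 16383 and the rounding behavior are assumptions, and real raw converters also subtract a black level first.

```python
# Sketch: scale a raw sensor readout (assumed 14-bit, 0..16383) into an
# 8-bit range by dividing by the max readout and multiplying by 2^8 - 1.

def normalize_raw(value, max_readout=2**14 - 1, bits_out=8):
    """Map a raw sensor readout into a smaller integer range."""
    return round(value / max_readout * (2**bits_out - 1))

print(normalize_raw(16383))  # full-scale 14-bit -> 255
print(normalize_raw(0))      # -> 0
print(normalize_raw(8192))   # mid-scale -> 128
```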
1 Answer
What is the area of CIE 1931 diagram that is being covered by contemporary digital camera sensors?
The raw values are not colors per se, and the concept of gamut does not work well with the raw output of digital cameras. The data become colors after raw development, which depends on many factors.
That said, I was wondering what the outcome of my camera is, combined with Lightroom processing, my particular camera profiles, and my typical "neutral" processing. I selected a couple of images and plotted their individual color values on an xy diagram.
The triangles represent sRGB (smallest), Adobe RGB, and ProPhoto RGB (largest). The images I used are below the charts. These are not complete gamut plots, but I hope they help to illustrate the range.
[Charts: measured color values plotted on the CIE 1931 xy diagram with the sRGB, Adobe RGB, and ProPhoto RGB triangles, followed by the source images]
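The kind of per-pixel xy plot described above can be computed roughly as follows. This is a minimal sketch assuming already-linearized sRGB input and the standard sRGB/D65 matrix; the answer's actual pipeline goes through Lightroom and camera profiles, which this does not reproduce.

```python
# Convert linear sRGB components to CIE 1931 xy chromaticity:
# first RGB -> XYZ with the standard sRGB (D65) matrix, then
# project to the chromaticity plane: x = X/(X+Y+Z), y = Y/(X+Y+Z).

SRGB_TO_XYZ = [
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
]

def srgb_linear_to_xy(r, g, b):
    """Return CIE 1931 (x, y) chromaticity for linear sRGB components."""
    X, Y, Z = (m[0]*r + m[1]*g + m[2]*b for m in SRGB_TO_XYZ)
    s = X + Y + Z
    return (X / s, Y / s)

x, y = srgb_linear_to_xy(1.0, 0.0, 0.0)
print(round(x, 3), round(y, 3))  # pure red lands on the sRGB red primary: 0.64 0.33
```

Plotting many such (x, y) pairs over an outline of the spectral locus gives diagrams like the ones in the answer.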
-
This diagram will not work well to characterize a camera sensor. Any bright enough color point in the diagram that you can produce (and it is you who has to produce it) will result in a raw value from the sensor. This value is not right or wrong by itself. – Wirewrap Feb 20 '15 at 10:44
-
Gamut in a digital camera would be the color space supported by the camera sensor's response to light with respect to its noise levels. Gamut is effectively the SPACE within which colors can be modeled. Newer cameras, particularly those with sensors using more modern manufacturing techniques and with much lower noise (Sony Exmor, Samsung's NX1 sensor), are going to support a much broader range of color than older sensor designs that still suffer from high noise, such as Canon sensors. To really measure a camera's color response, you need to test it with extreme colors. – jrista Feb 20 '15 at 18:25
-
@Wirewrap Sure, it characterizes the outcome of the complete workflow, where the sensor and the generated raw values are just the starting point. Sorry if this was not clear from the answer. – MirekE Feb 20 '15 at 18:42
-
@jrista The diagrams above are from a pretty outdated sensor. My original intention was to map a larger selection of various highly saturated colors and use more modern sensors, but I came to the conclusion that this is pointless from a practical point of view: even with the old and noisy sensor of the Leica M9, the outcome of the workflow already reaches the boundaries of the largest practical RGB color space in multiple directions. – MirekE Feb 20 '15 at 19:06
-
I wasn't addressing the gamut maps, I was responding to this: "The raw values are not colors per se and the concept of gamut is not working well with raw output of digital cameras." I disagree that the concept of a gamut does not work well with digital cameras. The gamut of a digital camera would be bound by the sensor's color response and noise levels. Each image would then be limited in gamut (color extent) by that response. – jrista Feb 20 '15 at 22:19
-
@jrista Thanks for your response. How do you translate that into xy coordinates of CIE 1931 diagram? – MirekE Feb 20 '15 at 22:51
-
Gamut is more than coordinates on a two-dimensional diagram. Gamut is the full three-dimensional extent of color, its saturation, and its intensity. You would first need to measure: take photos of a proper test chart with sufficiently saturated and desaturated colors, of sufficiently varying degrees of intensity, and generate a full profile of the camera's gamut. That gamut could then be compared to other gamuts, or even full color spaces such as L*a*b*, to determine how effective it is at sensing and replicating color. – jrista Feb 21 '15 at 01:04
-
@jrista I added a link to the original answer to Munsell Color Science Laboratory Q&A where this topic is discussed. – MirekE Feb 21 '15 at 07:15
-
Well, Colorimetric Quality Factor or Gamut, whatever you choose to call it, the sensor and readout system of a camera DOES limit the range and discernment of colors within each camera. All cameras are not the same, some have considerably greater color and tonal discernment, others have less. There are many factors that play into this...native silicon response, color filter strength, etc. Render data from many cameras with the same algorithm, and these differences in hardware affect the rendering results. – jrista Feb 21 '15 at 17:06
-
@jrista Gamut is defined for output devices, such as monitors and printers. Sensors register all colours that are presented to them. The limitations of colour capture are described by metameric failures. – Iliah Borg Feb 21 '15 at 17:31
-
OK then, an additional question. If the Bayer filter has only an RGB matrix, how can it register any color that couldn't be reproduced by that very same color combination on a PC display? – Cornelius Nov 16 '15 at 23:00
-
@Cornelius Due to physical/technological limitations of the light source used to reproduce the color on the PC display. – MirekE Nov 16 '15 at 23:26
-
@MirekE Yes, I would suppose that, since we know they try their best to use better backlighting. But does that mean that if they used a tri-color LED backlight with exactly the same frequencies as the Bayer matrix, they'd get the same gamut? Even more than that, if they used a green LED with light that is around (x, y) = (0.1, 0.8), would they get a lot more than 100% of Adobe RGB coverage? – Cornelius Nov 17 '15 at 00:36
-
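A back-of-the-envelope check of that guess: compare the xy-plane triangle area of a hypothetical display with a (0.1, 0.8) green primary against the Adobe RGB triangle. This is only a sketch; the red and blue primaries below are borrowed from the standard published Adobe RGB values, the green one is Cornelius's hypothetical, and real "coverage" figures are computed from the intersection of the gamut polygons, not a plain area ratio.

```python
# Shoelace formula gives the area of a primaries triangle in CIE xy;
# the ratio of two areas gives a rough coverage-style number.

def triangle_area(p1, p2, p3):
    """Area of a triangle given as three (x, y) points (shoelace formula)."""
    return abs(p1[0]*(p2[1]-p3[1]) + p2[0]*(p3[1]-p1[1]) + p3[0]*(p1[1]-p2[1])) / 2

ADOBE_RGB = [(0.6400, 0.3300), (0.2100, 0.7100), (0.1500, 0.0600)]
# Hypothetical display: same red/blue, deep-green LED primary near (0.1, 0.8):
HYPOTHETICAL = [(0.6400, 0.3300), (0.1000, 0.8000), (0.1500, 0.0600)]

ratio = triangle_area(*HYPOTHETICAL) / triangle_area(*ADOBE_RGB)
print(f"{ratio:.0%}")  # -> 124%, i.e. an area well beyond the Adobe RGB triangle
```

So by raw triangle area, such a green primary would indeed push the display's gamut noticeably past Adobe RGB.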
@Mirek Oh, it seems it already exists. I'm too slow :) http://www.pcworld.com/product/1409070/lacie-724-widescreen-lcd-monitor.html – Cornelius Nov 17 '15 at 00:48