
Considering the web environment only (not screen-to-print jobs), let's suppose that we have a font color, e.g. hex #f1ecd6, displayed on an LCD that has been properly calibrated with a hardware device, using different profiles applied one by one (e.g. 6500K/2.2 gamma, 5500K/2.2 gamma, 5000K/2.2 gamma, or any other values set by our personal preferences), in ideal room lighting conditions.

How can we know that what we physically see represents the "true" value displayed on the screen while toggling between the profiles described above? In other words, which is the "right" color: the value displayed at 6500K/2.2 gamma, at 5500K/2.2 gamma, or at 5000K/2.2 gamma in ideal room lighting conditions?
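
(For reference, a hex value such as #f1ecd6 only names a colour relative to a colour-space definition; for web content that is normally sRGB, which assumes a D65 white point and a roughly 2.2-gamma transfer curve. Below is a minimal Python sketch of that interpretation; the helper names are illustrative and not part of the original question.)

    # Interpret #f1ecd6 as an sRGB colour and convert it to CIE XYZ.
    # sRGB is defined against a D65 white point and a ~2.2-gamma transfer curve,
    # so a hex code only has a definite meaning under that assumption.

    def srgb_to_linear(c):
        """Apply the sRGB electro-optical transfer function to a 0..1 channel value."""
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    def hex_to_xyz(hex_color):
        """Convert an sRGB hex string (without '#') to CIE XYZ relative to D65."""
        r, g, b = (int(hex_color[i:i + 2], 16) / 255.0 for i in (0, 2, 4))
        rl, gl, bl = (srgb_to_linear(c) for c in (r, g, b))
        # Standard sRGB (IEC 61966-2-1) linear-RGB to XYZ matrix, D65 white point.
        x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
        y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
        z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
        return x, y, z

    print(hex_to_xyz("f1ecd6"))  # XYZ of #f1ecd6 under the sRGB/D65 assumption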

mattdm
user124853
  • How do you define "ideal room lighting conditions"? – Dragos Dec 16 '15 at 19:15
  • Sorry for the late answer; I've explained below why I didn't mention it from the start. In color management there is a single ideal set of room lighting conditions: the color temperature (K) of the room lighting setup equals the color temperature (K) of the display profile. More than that, if the monitor doesn't roughly match the lighting conditions, we will have a hard time because our eyes will try to adapt to a different sense of "white" all the time, which can increase eye strain and make it harder to judge color. – user124853 Dec 17 '15 at 08:54
  • I'd love to see a source for the claim that room lighting must match monitor lighting or there is eye strain. – Paul Cezanne Dec 17 '15 at 10:26
  • Sure, here we go: please read this chapter, page 6, "Color Temperature Adjustment". – user124853 Dec 17 '15 at 10:48
  • Interesting. I'm not sure I buy it. I'd prefer an academic source, not one from a company that sells monitors. High-quality ones for users, but still, that's marketing information. In an office I'd suspect glare and seating position, and then brightness, as far bigger contributors to eye fatigue. – Paul Cezanne Dec 18 '15 at 11:17
  • @PaulCezanne IMHO this applies no matter what display brand we have in discussion. Another source (please read the "Selecting Bias Light" chapter); also here is another source, pretty serious. 99.9%, there is nothing commercial in this. – user124853 Dec 18 '15 at 12:54
  • There is a reason there are ISO standards for viewing conditions: Because they affect our perception of that which we are viewing. – Michael C Dec 18 '15 at 21:40

3 Answers


If the display is calibrated correctly and you are using a profile correctly, you can assume the colour is correctly displayed, or more precisely, that it is displayed as accurately as the system can manage. But short of dragging out a lot of expensive laboratory equipment to double-check that the calibration and profile are effective, there is nothing more you can do.

Also note that human vision (eye + brain) is very personal. We all see colour slightly (or even very) differently, so "right" is also not a very easy thing to define outside of a very narrow scientific context.

In the context of producing an image for human consumption I would also say that I aim for pleasing colour, not accurate colour. Accurate colour portraits are often not well received by their subjects, in my experience. Colour balance is often used to enhance the emotional response of the viewer.

Put it this way: do you photograph for your own technical gratification or for the viewer's gratification? Is it about the emotional content of the shot or technical precision? I'm not saying one is right or one is wrong; that's a personal thing. I am suggesting a different viewpoint from a purely technical one.

StephenG - Help Ukraine
  • In photography, I guess that more or less there could be an accepted variation due to the relative viewing conditions of each user. In web design, for example, when we are referring to a hex color (well defined), I suppose that there should be only one (?) "true" color displayed on screen. That's the reason for starting this thread :) – user124853 Dec 17 '15 at 08:36
  • In practice displays have a wide variation in colour response, and even if they start out with the same response from the factory, these quickly get changed by users (brightness, contrast, saturation) and drift out of calibration randomly. So you're unlikely ever to get reliable colour performance from web use. Ballpark reasonably close is as much as is worth doing (and I do software engineering, BTW). IMO. YMMV. – StephenG - Help Ukraine Dec 17 '15 at 10:08

in ideal room lighting conditions?

"Ideal" is probaby all the lights turned off. So the white then becomes the pure white of your white screen. Another ideal is probably a light and gray walls that match your same white balance on your monitor. Which again depends on your monitor settings.

The question could be "What is an ideal white point?", which again is relative, specifically to our Sun. The page https://en.wikipedia.org/wiki/Color_temperature says:

"The effective temperature, defined by the total radiative power per square unit, is about 5,780 K.[5] The color temperature of sunlight above the atmosphere is about 5,900 K.[6]"

This midday sunlight is also affected by the blue cast of the atmosphere, which adds some blue, or some more degrees Kelvin.

But you probably need a more standardized white point:

"Daylight has a spectrum similar to that of a black body with a correlated color temperature of 6,500 K (D65 viewing standard) or 5,500 K (daylight-balanced photographic film standard)."

I would probably vote to use the D65 standard.

"Digital cameras, web graphics, DVDs, etc., are normally designed for a 6,500 K color temperature. The sRGB standard commonly used for images on the Internet stipulates (among other things) a 6,500 K display whitepoint."

Rafael
  • This could be the answer that I'm looking for. In other words, are you saying that a color value (e.g. the hex mentioned above), in ideal conditions (room lights off OR a lighting setup whose color temperature (K) corresponds to the display's color temperature), could have different "right" perceived values? At 6500K our optical system (eye + brain) will see one value, and at 5500K, which is also found within the daylight spectrum, another "right" color value, with the final profile choice in the end depending on the purpose of the workflow (general, web OR print/prepress)? – user124853 Dec 17 '15 at 08:39
  • Your eyes adapt to different temperatures, yes, so for example 5500K can be right. But the most standardized temperature is 6500K, so in my opinion go for that one. – Rafael Dec 17 '15 at 13:12

Your question leaves out the most important part of the entire equation. It is the reason we do color management. Which setting allows the viewer to perceive the most accurate color?

You refer to "ideal room lighting conditions" without specifying exactly what you consider to be ideal. And there's the crux: under one specific set of viewing conditions we will perceive the same color produced on the screen differently than we would under a different set of viewing conditions.

If the ambient light in the room is balanced at 6500K, then 6500K is the most appropriate choice for your screen's calibration. If the ambient light in the room is balanced for 5500K, then 5500K is the most appropriate choice for your screen's calibration. If the ambient light in the room is balanced at 5000K, then 5000K is the most appropriate choice for your screen's calibration. And so on...

The first step should always be measuring the intensity and temperature of the ambient light falling on the screen and the surrounding field of view when the viewer is observing the screen. The ISO standard for viewing prints is at D50 (full spectrum centered on 5000K) at about 2,000 lux. See this answer for more on that. For screen display the generally accepted standard is D65 (broad spectrum light centered on 6500K), but that assumes you have managed the viewing conditions to match.
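
(If you do measure the ambient light, the measured chromaticity can be turned into an approximate correlated colour temperature. Below is a rough sketch using McCamy's approximation; the function name and the sample reading are illustrative.)

    # Estimate correlated colour temperature from a measured CIE 1931 (x, y)
    # chromaticity using McCamy's approximation (adequate for typical room light).

    def mccamy_cct(x, y):
        """Approximate CCT in kelvin from CIE 1931 chromaticity coordinates."""
        n = (x - 0.3320) / (0.1858 - y)
        return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33

    # A reading near the D65 chromaticity should come out close to 6500 K.
    print(round(mccamy_cct(0.3127, 0.3290)))  # ~6505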

Michael C
  • Thank you for your feedback. In color management there is a single ideal set of room lighting conditions, described above by you; I didn't mention it in order not to distract attention from the main subject of the question. I've read about it before, but thank you again for mentioning it. Having said that, let's suppose that we have the same display, calibrated (and correlated to the ideal lighting conditions) using the described profiles one by one. As an observer, which could be considered the right color? :) – user124853 Dec 17 '15 at 08:18
  • You still haven't defined Ideal! Is it defined as a viewing environment of full spectrum light centered on 6500K at an acceptable brightness? For all three profiles mentioned in your question? Or is "ideal" 6500 K only for D65, while "ideal" for the D50 profile is a viewing environment of full spectrum light centered on 5000K at an acceptable brightness? And "ideal" for your 5500K profile is a viewing environment of full spectrum light centered on 5500K at an acceptable brightness? – Michael C Dec 18 '15 at 21:47
  • Defining ideal is absolutely necessary to properly answering the question. Without defining the current viewing conditions there is no way to properly determine which profile is most correct. – Michael C Dec 18 '15 at 21:48
  • In the viewing environment centered on 5000K, the D50 profile will display colors most correctly. In a viewing environment centered on 5500K, the 5500K profile will display the colors most correctly. In a viewing environment that complies with D65, the D65 profile will display the colors most correctly. In all three cases the same numerical color should be perceived as the same color! – Michael C Dec 18 '15 at 21:51
  • Yes, color temperature (K) of the room lighting setup = color temperature (K) of the display profile, brightness by preference. I mentioned it in a comment above and Rafael gave me an answer. Thank you. – user124853 Dec 19 '15 at 10:08
  • So what was the original point of your question? What problem were you trying to solve? I'm still trying to understand how you believe you can select the most correct monitor profile without first defining/measuring the viewing conditions. And "ideal viewing conditions" are never with all of the lights all the way off. There is no ISO standard for such viewing conditions. – Michael C Dec 19 '15 at 15:50
  • I was concerned about the way a specific color is reproduced on screen using different profiles like 5000K, 5500K, 6500K at 2.2 gamma in ideal lighting conditions (I don't think someone will view/edit photos with the lights off; this condition was set in order to emphasize that in my question we agree we already have ideal lighting conditions). As long as, at least for the web, there is no standard color temperature and it's up to the viewer which settings to use, there is no "right" color. – user124853 Dec 19 '15 at 17:18
  • I also understand that when using 6500K there is no guarantee that it is the best decision and that the color will be "true", but since it is the display color temperature most used by regular users and/or web designers/photographers, in order to reach the "target" audience it could be a solution, at least for the moment (until a web standard is implemented), except of course when we have to do a screen-to-print job. – user124853 Dec 19 '15 at 17:29
  • But the point is, if you are viewing it on a 6500K-calibrated monitor in 6500K light, the color #f1ecd6 will be perceived the same as if you were viewing color #f1ecd6 on a 5000K-calibrated monitor in 5000K light. Therefore it doesn't matter if you edit an image on a 6500K monitor (under 6500K lighting) and the viewer sees it on a 5000K monitor. As long as the lighting conditions match the monitor calibration in both instances, and the color is inside the color gamut of both monitors, the perceived color will be the same. – Michael C Apr 04 '16 at 11:10
  • Good to know, thanks. How about the gamma value, do you agree with this answer? In other words, I'm not sure which gamma value to choose between native and some other gamma value (using a 1D LUT, which will cut the colour depth). – user124853 Apr 04 '16 at 11:56
  • Yes. It is always best practice to calibrate the monitor itself using internal controls as close to the target as possible before creating a software profile to zero it in. When you use a colorimeter, one of the first steps is usually setting the brightness and contrast using the monitor's controls. This has an effect on the monitor's gamma response. – Michael C Apr 05 '16 at 06:33
  • True, but in these conditions, assuming that we have already used the internal controls to get as close to the target as possible and we want to create a profile, in your opinion is it better to choose native (without a 1D LUT) or, let's say, sRGB's gamma value of 2.2? My concern is that applying a 1D LUT will cut the colour depth, according to that answer and another source I've found (see the sketch after this comment thread). On the other hand, using the native gamma of 2.43 measured in my case, it's not sRGB's gamma value; on other monitors that use sRGB in the same ambiental conditions, the result will be different. (?) – user124853 Apr 05 '16 at 07:04
  • The point is, when you've altered the monitor's brightness and contrast levels through the use of the monitor's own controls, you should have effectively altered the "native" gamma value of the monitor to a degree. What measuring device and software are you using to measure your monitor's native gamma value and build your monitor profile? – Michael C Apr 05 '16 at 07:12
  • "On the other hand, using native gamma 2.43 measured in my case, it's not sRGB's gamma value; on other monitors that use sRGB in the same ambiental (sic) conditions, the result will be different." So what about monitors that don't have the same color depth? You're never going to have exact matches, not even to other properly calibrated and profiled monitors with a different gamut. Also, anytime you apply a profile it can only reduce color depth compared to running without a profile. A profile cannot increase a monitor's response to more than 100% of the response that the monitor is capable of. – Michael C Apr 05 '16 at 07:18
  • I am using a ColorMunki Display device + DisplayCAL (formerly known as dispcalGUI). I understand that you support the idea of using the 1D LUT. – user124853 Apr 05 '16 at 10:10
  • I'm trying to learn what the difference between a 1D LUT and a 3D LUT is. I've always just used LUT. – Michael C Apr 06 '16 at 06:07
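
(Regarding the concern in the comments that a 1D LUT cuts colour depth: below is a quick sketch that counts how many distinct 8-bit output levels survive when a per-channel gamma-correcting LUT is applied. The 2.43-to-2.2 correction mirrors the numbers mentioned above, but the values are only illustrative.)

    # Why an 8-bit 1D LUT "cuts" colour depth: remapping 256 input codes through
    # a gamma correction makes several inputs collapse onto the same output code,
    # so fewer than 256 distinct output levels remain.

    native_gamma = 2.43   # measured native response (from the comments, illustrative)
    target_gamma = 2.2    # desired effective response, e.g. to approximate sRGB

    lut = [
        round(((i / 255.0) ** (target_gamma / native_gamma)) * 255)
        for i in range(256)
    ]

    print("distinct output levels:", len(set(lut)), "out of 256")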