
When I merge my brackets with Lightroom it's always 8-bit and in HDR Efx Pro 2 it's also in 8- or 16-bit.

When I use PS to merge brackets I can convert it to 32-bit but then I can't post-process it the way I want.

I don't want my HDR pictures to look all funky and dingy. I've been watching a lot of tutorials on HDR but they don't mention anything about file types (TIFF, RAW, etc.) and what bit depth they use.

Even all the RAW files are in 8-bit. Is that the reason why I can't work in 32-bit? Am I missing something here?

Chris
  • While in Lightroom, nothing is 8-bit, at least not that you'd know. When you export an image, you set the format yourself in the export dialog. – ths Dec 20 '16 at 21:31

1 Answer


You are probably missing some points.

First of all, HDR is a concept: having more dynamic range.

Second, it is about the amount of information. A real HDRI file stores floating-point information, which is an efficient way to encode huge differences in values. It is more efficient than defining, for example, 16 million fixed possible values.
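
As a rough illustration, here is a small Python sketch (my own, using numpy, which is an assumption and not something the question's tools expose) comparing a fixed ladder of integer levels with floating-point values:

    import numpy as np

    # 16-bit integers give a fixed ladder of 65,536 evenly spaced levels and nothing beyond the top.
    print(np.iinfo(np.uint16).max)        # 65535

    # 32-bit floats (the usual HDRI pixel type) can hold values many orders of magnitude apart,
    # so a deep shadow and a sun-lit highlight fit in the same image.
    shadow = np.float32(0.0001)           # very dark radiance
    highlight = np.float32(50000.0)       # very bright radiance
    print(highlight / shadow)             # roughly 500,000,000 : 1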

But we need to transform that "raw" information into something more usable. This is what tonal mapping does. A 16-bit image has plenty of room to adjust and produce a visible, reproducible image.
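
A minimal sketch of what a tonal map does, again in Python for illustration only; real HDR software uses far more elaborate local operators than this simple global x / (1 + x) curve:

    import numpy as np

    def tone_map(hdr, exposure=1.0):
        """Squeeze floating-point HDR radiance into a displayable 8-bit range
        using the simple global curve x / (1 + x)."""
        scaled = hdr * exposure
        ldr = scaled / (1.0 + scaled)              # compress [0, infinity) into [0, 1)
        return (ldr * 255.0 + 0.5).astype(np.uint8)

    # Fake HDR values spanning a huge range, from deep shadow to a blazing highlight.
    hdr = np.array([0.001, 0.1, 1.0, 10.0, 1000.0], dtype=np.float32)
    print(tone_map(hdr))                           # [  0  23 128 232 255] -- all displayable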

RAW is not a file type by itself; it is a type of unprocessed information.

A TIFF file can store different depths of pixel information that can be read by different programs. Normally, a usable TIFF file stores 8 or 16 bits per channel (24, 32, or 48 bits in total).
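
For example, the same pixel grid can be written at different depths; this sketch assumes the third-party tifffile package, which is my choice and not something the question requires:

    import numpy as np
    import tifffile  # assumed helper package; any TIFF-capable library works similarly

    h, w = 100, 100
    tifffile.imwrite('rgb_8bit.tif', np.zeros((h, w, 3), dtype=np.uint8))     # 8 bits/channel, 24 bits total
    tifffile.imwrite('rgb_16bit.tif', np.zeros((h, w, 3), dtype=np.uint16))   # 16 bits/channel, 48 bits total
    tifffile.imwrite('rgb_float.tif', np.zeros((h, w, 3), dtype=np.float32))  # 32-bit float/channel, HDR-style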

32 bits per channel means several billion levels, so it is quite complex to handle. (Do not confuse this with 32 bits in total, which is, for example, a CMYK image at 8 bits per channel.)
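
To make those level counts concrete (plain arithmetic, nothing tool-specific):

    # Levels per channel for common integer bit depths.
    print(2 ** 8)           # 256            (8 bits per channel)
    print(2 ** 16)          # 65,536         (16 bits per channel)
    print(2 ** 32)          # 4,294,967,296  (32 bits per channel: "several billion")

    # Total colors of an 8-bit-per-channel RGB image, the "16 million" figure mentioned below.
    print((2 ** 8) ** 3)    # 16,777,216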

But if you work in Photoshop, the best option is to use its native file format, PSD.


A normal monitor displays only 256 levels of each color (8 bits per channel), potentially displaying 16 million different shades (24-bit images).

Some monitors have a huge contrast ratio, for example 10,000:1, but that is how different shade 0 is from shade 255. It does not mean the monitor displays 10,000 different tones; it describes how far apart the darkest and brightest tones are.

Some systems report the display as 32-bit instead of 24-bit. In reality this is used by the operating system to indicate that it can render transparent objects, for example a translucent window while you move it. Those 32 bits are not per channel; they are 8 bits for each RGB channel (24) plus 8 bits of transparency (24 + 8 = 32).
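
A quick sketch (values picked arbitrarily) of how such a 32-bit desktop pixel breaks down into four 8-bit channels:

    # A "32-bit" desktop pixel is four 8-bit channels: red, green, blue and transparency (alpha).
    r, g, b, a = 200, 120, 30, 255                 # arbitrary example values, each 0-255

    pixel = (a << 24) | (r << 16) | (g << 8) | b   # pack into one 32-bit ARGB word
    print(hex(pixel))                              # 0xffc8781e

    # Unpacking shows each color channel still has only 8 bits (256 levels).
    print((pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF)   # 200 120 30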

You do not "scale down" a bit depth; you simply reduce it.

But again, do not confuse X-bit images with X bits per channel.


I can change my monitor from 16 to 32bit

This is total bit depth, not per channel. A 16-bit monitor setting means 16 bits in total, so it can only display about 65,000 different colors, and you will see banding. Leave your monitor at 24 or 32 bits.
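
For the record, here is a small illustration assuming the classic R5-G6-B5 "High Color" layout those old 16-bit modes used:

    # A 16-bit "High Color" mode packs 5 + 6 + 5 bits per pixel for R, G and B.
    print(2 ** 16)                   # 65,536 colors in total
    print(2 ** 5, 2 ** 6, 2 ** 5)    # only 32, 64 and 32 levels per channel -> visible banding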

Rafael
  • I'm not completely illiterate in terms of computers, but I hear people say that our monitors can't show a real 32bit file but then they're processing it in PS in 32bit. I want to refrain from programs like Photomatix or HDR Efx and work with layer masks or luminosity masks in PS. In order to get the best quality out of my pictures, I need to know more about what file type to use so that I can print a nice big picture to put on the wall. If I printed my current pictures, they would look all fuzzy and noisy. – Chris Dec 20 '16 at 20:36
  • As I commented, a 16-bit image has enough room. You have 65,000 levels to play with, and your final image can be 8-bit, or 256 levels per channel. Take a look at this: http://photo.stackexchange.com/questions/72116/whats-the-point-of-capturing-14-bit-images-and-editing-on-8-bit-monitors/72121#72121 - The fuzziness could be due to something else, not bit depth. Probably resolution or focus. – Rafael Dec 20 '16 at 20:47
  • Could have something to do with the size as well? If I resized it to fit an actual standard monitor, would it appear less – Chris Dec 20 '16 at 21:00
  • *less fuzzy? Sorry for the typo. – Chris Dec 20 '16 at 21:01
  • What I also don't understand is why I can change my monitor from 16 to 32bit but people say you can't watch images on a normal monitor that were rendered in 32 bit. The guy in the other thread said he scaled his pictures down to 8bit to fit all monitors. If I search for 32bit monitors I get simple LCD monitors. – Chris Dec 20 '16 at 21:09
  • I complemented my answer a bit. (Do not confuse that "a bit" with a digital bit) n_n – Rafael Dec 20 '16 at 21:38
  • @Chris realize that you're not changing the number of bits in the monitor, you're changing the number of bits in the display adapter. And I've never figured out what they use the extra 8 bits for, since the majority of monitors can't display more than 24 total bits. – Mark Ransom Dec 20 '16 at 21:50
  • @Mark Ransom Then I'm really missing something here. My graphics card and monitor can both display 32bit and I haven't seen one monitor or graphics card that can't live up to that nowadays. So I should edit it in 16 or even 8 bit and then leave it at that? – Chris Dec 20 '16 at 22:07
  • @Chris what's your monitor model? I'll double-check for myself. Just because it accepts 32 bit doesn't mean it uses more than 24. – Mark Ransom Dec 20 '16 at 22:15
  • @MarkRansom I double checked and it can only display 24bit. Why can I see the changes in a 32bit picture then? A 32bit image created by PS always looks smoother than a 16bit image. What if I looked at a 32bit image on a 16bit monitor? What would change? – Chris Dec 20 '16 at 22:23
  • @Chris there are no 16 bit monitors. A 16 bit image (16 bits total, not per color channel) is very limited in color and will definitely be inferior to 24 or 32 bits. – Mark Ransom Dec 20 '16 at 22:31
  • @MarkRansom So what is the difference between per channel and per color? I'm editing in 16bit in LS,HDR Efx or Photomatix. Does that mean it's 16bit per channel? – Chris Dec 20 '16 at 22:48
  • @Chris I don't know those applications so I can't be sure, but I think they all are referring to 16 bits per color channel. The only exception might be Photoshop, and I'm not sure about it either. – Mark Ransom Dec 20 '16 at 22:53
  • @MarkRansom, yep, there is a 16-bit color mode on some legacy monitors, but again that is only High Color, not True Color. Chris, read my edits to the answer. – Rafael Dec 20 '16 at 23:03
  • @Rafael there are some monitors that can handle 10 bits per channel (30 total) but I'm not sure how they interact with the display adapter. That whole area is one where I've wanted to educate myself, but the information is hard to come by. Combine that with color management and it becomes a huge mess. – Mark Ransom Dec 20 '16 at 23:08