
Blue Green Red (BGR) byte ordering exists in a few image processing libraries "for historical reasons", mostly because it seems to have been a Microsoft standard.

Does anyone know the historical reason?

The color spectrum is almost always read as "Red, Green, Blue" in English.

I can't think of any byte-ordering argument that would favor one over the other. OpenGL and earlier hardware graphics tended to be either ARGB or RGBA (IIRC), and Intel little-endianness doesn't seem to make any difference in a 4-byte structure.

The only blue-first reason I can think of is that broadcast uses YUV, where U is the blue-difference component and V is the red-difference component.

Ola Ström
Martin Beckett
    "Intel little-endian doesn't seem to make any difference in a 4byte structure" <-- please elaborate. In little-endian, the least significant byte comes first while in big-endian the most significant byte comes first. – snips-n-snails Jun 01 '17 at 19:11
  • @traal but it doesn't give any particular advantage to reading RGB vs BGR – Martin Beckett Jun 01 '17 at 19:43
  • If byte 0 is red, byte 1 is green, and byte 2 is blue, then the little-endian four-byte word starting at address 0 contains bytes in the order: xBGR. – Tommy Jun 01 '17 at 20:01
  • @MartinBeckett You're implying that BGR is stranger than RGB. I'd recommend changing the title to something like "Why was RGB used sometimes and BGR used other times?" (although better worded, obviously). – wizzwizz4 Jun 01 '17 at 21:43
  • @wizzwizz4: One is a worldwide standard. The other was introduced later in a small range of applications and offers no advantages whatsoever, creating incompatibilities instead. Why would you claim the other is not stranger? If you look at the formats alone, yes, they are equivalent, but if you take the broader context of the IT community, this is like an odd kid who decides to use "Se" instead of "He" as a male pronoun. Discussing whether "Se" is as good a word as "He" is beside the point. – SF. Jun 02 '17 at 08:01
  • Also, I've double-checked to make sure it wasn't my imagination, but the iPhone uses BGR, to the extent that if you include a PNG in an iPhone project and permit the action, Xcode will switch the channel order at build time, embedding a suitably marked but technically non-standards-compliant version of the PNG in your app. Which it can then load slightly more quickly. This behaviour goes all the way back to the original model, when I guess it will have made more of a difference. – Tommy Jun 02 '17 at 15:36
  • @MartinBeckett If something reads pixel data as 32-bit integers and retrieves red, green, and blue component values by masking and shifting, then endianness does make a difference. If the underlying pixel data were stored in RGBA order, then C code on little-endian machines would be written to do masks and shifts using ABGR. – jamesdlin Jun 03 '17 at 05:48
  • Fun side story... Not all early 24-bit video cards used the same color sequence either. I remember using an ATI card with some software that didn't recognize it. I eventually found a driver that was compatible enough to get into the right mode, but the colors were all wrong because the color info was stored in a different order in memory... – Brian Knoblauch Aug 15 '18 at 16:04

1 Answer


The ordering for the color values stems from a desire to store color palettes in memory in a way that is easily transferable to the VGA RAMDAC used in IBM (and compatible) VGA video cards.

The original IBM VGA implementation used the INMOS IMS G171/G176 RAMDAC chip as the Color Look-Up Table (CLUT) for VGA color modes, displaying up to 256 colors from an 18-bit color palette. This RAMDAC has a peculiar way of reading and writing color palette entries over its 8-bit data bus: three bus cycles are required to transfer an 18-bit palette entry, one byte at a time. During each access, an internal address register is incremented so that the cycles read/write Red, Green, then Blue, in that order. The internal register resets after Blue.

Thus, it becomes an issue of convenience. By storing the palette values in CPU RAM in xBGR order, the individual byte values end up in the proper sequence for the RAMDAC when they are transferred to it, byte by byte, over its 8-bit data bus.

This data sheet for a compatible RAMDAC from Analog Devices gives more detail on how palette reads/writes are performed with these chips.

Brian H
  • The RAMDAC has three output wires associated with the first, second, and third color component. Although they are arbitrarily labeled as red, green, and blue, the chip has no reason to know nor cares which wire is connected to the red, green, or blue signal of a monitor, or whether they're wired to three different monochrome monitors used to show somewhat different things (such a usage scenario would make sense mainly if most content would be the same on all three monitors, but one wanted to map 8 of the 256 colors to different combinations of black/white so one could... – supercat Jun 13 '17 at 22:37
  • ...have a few pieces of information that would only show up on one screen or the other. – supercat Jun 13 '17 at 22:38
  • The datasheet for the 171 clearly labels the pins as RED, GREEN and BLUE, in that order by pin number. So while you could certainly have used them for whatever you wanted, since the behavior was the same, I'd expect that everybody who used that RAMDAC followed the convention established by the manufacturer for channel ordering. – wrosecrans May 25 '20 at 20:43
  • This answer makes no sense at all. As you said, the VGA palette is written in the order RGBRGBRGB..., so why is the palette stored as BGRxBGRxBGRx... in the BITMAPINFO structure? – benrg Oct 13 '21 at 18:44