
In the MS-DOS Editor, the only color choices available were a fixed set of 16 colors:

[Screenshot: The QBasic-based MS-DOS Editor 1.1, with the ‘Display’ dialog open, showing color customisation options.]

That's 16 colors:

  • Black
  • Blue
  • Green
  • Cyan
  • Red
  • Magenta
  • Brown
  • White
  • Gray
  • Bright Blue
  • Bright Green
  • Bright Cyan
  • Bright Red
  • Pink
  • Yellow
  • Bright White

How were these colors chosen, and why were there only 16?

– lvd

5 Answers


The original IBM Color Graphics Adapter (CGA) for the first IBM PC introduced the "80x25 at 16 colors" text display mode for use with output to color monitors like the IBM 5153 (as opposed to output to televisions, where you'd want the 40-column mode). All later color graphics adapters (EGA, VGA, etc.) provide compatibility with that mode, and that's the mode the MS-DOS Editor runs in as a common baseline.

As for why 16 colors: RAM was very expensive. Four-bit color gives you 16 colors and lets you pack a foreground and a background color into a single byte, so each character cell is only two bytes of screen memory and it takes just 4000 bytes to represent the whole screen.

(One bit to turn on the red electron gun, one bit for green, one bit for blue, and one bit to control whether they should be at low or high intensity. RGBI. Then, the character ROM installed on the video card is used to translate those into grids of pixels for each character.)
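
To make that packing concrete, here is a minimal C sketch of the two-byte character cell described above; the far-pointer poke in the comment assumes a real-mode DOS compiler such as Turbo C, where MK_FP is available:

#include <stdint.h>

/* One CGA text cell: a character byte plus an attribute byte packed as
   [blink/bg-I | bg R G B | fg I | fg R G B].
   80 x 25 cells x 2 bytes = 4000 bytes of screen memory. */
static uint8_t make_attr(uint8_t fg, uint8_t bg)
{
    return (uint8_t)((bg << 4) | (fg & 0x0F));
}

/* On real-mode DOS you would poke a cell straight into the 0xB800 text
   buffer, e.g. with Turbo C:
     uint8_t far *vram = MK_FP(0xB800, 0);
     vram[(row * 80 + col) * 2]     = 'A';              // character code
     vram[(row * 80 + col) * 2 + 1] = make_attr(14, 1); // yellow on blue
*/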


Edit: The CGA palette with hex equivalents:

 0  Black     #000000      8  Gray          #555555
 1  Blue      #0000AA      9  Bright Blue   #5555FF
 2  Green     #00AA00     10  Bright Green  #55FF55
 3  Cyan      #00AAAA     11  Bright Cyan   #55FFFF
 4  Red       #AA0000     12  Bright Red    #FF5555
 5  Magenta   #AA00AA     13  Pink          #FF55FF
 6  Brown     #AA5500     14  Yellow        #FFFF55
 7  White     #AAAAAA     15  Bright White  #FFFFFF

(Source: Wikipedia)

– ssokolow
  • "and one bit to control whether they should be at low or high intensity" - oh, that's interesting. So there were only 3 brightness levels for each channel (zero, medium, high) , and this is how the colors were "chosen" - those are the possible combinations. E.g. Black (0000), Green (0100), BrGreen (0101). But, how did they get Black (0000), Gray (1110), White (1111), BrWhite (????)? – Filip Milovanović Jun 07 '21 at 07:22
  • Or was it Gray (0001), with this particular value getting special treatment? – Filip Milovanović Jun 07 '21 at 07:23
  • You could add a color wheel or a line explaining how RGB are the primary components for additive color mixing, while CMY are the inverse / negative mixing ones - so the "choice" (which, as you explained, wasn't really a choice but a consequence of having 1 bit per color) relates to how color works in general, not just in computers. Anyway, fine answer! – AnoE Jun 07 '21 at 07:24
  • Black: 0000. Gray: 0001. White: 1110. BrWhite: 1111. – john_e Jun 07 '21 at 07:30
  • Well actually, "I" (intensity) is the high bit, then R, G, B; but otherwise you have it correct for the case of MDA and 16-color CGA/EGA. – snips-n-snails Jun 07 '21 at 07:39
  • Agree it's mainly about video memory - but it's also about the number of wires to the CRT and whether you want to invest in a DAC (or, generally, analog video circuitry) in the computer or not. – tofro Jun 07 '21 at 09:27
  • @AnoE If you want to be really pedantic, RGB and CMYK are convenient approximations based on how human eyesight works. It's actually possible to create more accurate models of the colours we can perceive, quite apart from the possibility that some women are actually tetrachromats. More relevantly, a 16-colour palette won't necessarily be evenly spaced within a particular colour space, but mapped to output intensities with a lookup table somewhere. – IMSoP Jun 07 '21 at 11:46
  • If memory serves, the high bit adds half-intensity on the "off" guns and yellow is the special case. Notice how Magenta + Bright -> Pink. – Joshua Jun 07 '21 at 14:09
  • RAM was very expensive compared to now. In the mid-'70s I bought an 8k x 8 (S-100 bus) memory board for around $250. $0.0038/bit, $0.0305/byte. – Technophile Jun 07 '21 at 15:31
  • @tofro Given that CGA cards support composite out, and the previous generation of home computers meant to target an even more price-conscious market segment had RF modulators, I think the RAM is more significant to the design than it being digital. – ssokolow Jun 08 '21 at 13:34
  • @ssokolow I think that is what I was saying ;) – tofro Jun 08 '21 at 13:45
  • @tofro Sorry. I'd just woken up, so, by the time I'd finished "adding diplomacy" to my comment, I'd forgotten the details of what I was replying to and didn't think to double-check. My original point was "I very much doubt that, if the previous generation could afford RF modulated analog output on bottom-bracket computers, dropping the modulator circuitry and bumping the column count up to 80 would cause the price to shoot up enough to be relevant in their calculations." (Or, in short, "RAM was so expensive, I don't think they even got to thinking about that.") – ssokolow Jun 08 '21 at 15:40
  • "One bit to blue them all, one bit to green them, one bit to bring red all, and then with brightness blind them" - from "Lord Of The Bits". The original seems to have been mistranslated... – Bob Jarvis - Слава Україні Jun 09 '21 at 03:02
  • @Technophile - "...and we liked it that way! We LOVED it!!" :-) – Bob Jarvis - Слава Україні Jun 09 '21 at 03:04
  • @ssokolow Well, all sorts of cost-limiting factors were prevalent in '80s computers - not just RAM. The Sinclair QL, for example, had 8 colours instead of 16 because they ran out of pins on a custom chip and couldn't spend one on intensity - and used the remaining bit in video RAM (it would have had enough for 16 colours) for flashing pixels, because that didn't need the extra pin. – tofro Jun 09 '21 at 08:04
  • CGA could actually output to televisions, but it needed an external modulator, as the card only had Y and C outputs for this purpose. – Jasen Jun 09 '21 at 12:39
  • @Jasen: Ironically, while the CGA had an 80-column color text mode, the colorburst timing in that mode was wrong and thus it would generally show mangled black and white text unless the border color was set. Because the border color was shown in the part of each scan line where colorburst should have been, setting the border color to yellow would make color text kinda sorta almost work, but for the most part the 80-column color text mode was broken in all cases where it didn't behave like the "black and white" mode. – supercat Jun 09 '21 at 20:02
  • @Jasen Actually the CGA did not have separate Y and C outputs, but single composite video output. It needed a modulator for connecting to RF antenna input, but could also be connected directly to a device with composite input. – Justme Jun 09 '21 at 20:05
  • That's possibly true, the one I encountered was a clone, and we don't do NTSC here so the best I could get was a monochrome image anyway. – Jasen Jun 09 '21 at 20:09

How the colors were selected depends in part on why there are only 16 of them.

In short, a CGA monitor takes a four-bit RGBI color input, which means 16 colors. Each RGB bit turns on the electron gun for that color, the I bit adds intensity to all of the guns, and the brown color is handled as an exception.

A color monitor has three electron guns for three color phosphors: red, green and blue. So if each gun is simply turned on or off, you need three bits to control the electron guns, which allows for 8 colors. Black is all guns off; white is all guns on. The three colors with a single gun on are red, green and blue. The three colors with two guns on are cyan, magenta and yellow.

For each character cell in the color text mode, the CGA reserves one full byte for character code and one full byte for 8-bit character attributes.

If there had been 2 color bits per gun in the monitor (which is what EGA does), that would mean 6 bits for 64 colors - too much to store per cell. It also made sense to have more colors than 8, so the attribute byte was split into 4 bits of foreground color and 4 bits of background color. Four bits is one bit for each color gun plus one extra bit for intensity, which allows for 16 different colors: three bits control the guns separately and one bit adds brightness to all of them.

There is one special mechanism that alters the color palette in the monitor. There is a bright yellow, but no dark yellow: the bit pattern for dark yellow is treated specially, and the analog voltages driving the electron guns are altered to produce brown instead. This basically means that the DAC, or color lookup table converting bit patterns of color values to analog video voltages, is in the CGA monitor that takes the digital 4-bit RGBI input.

Another special mechanism was implemented in the CGA card itself, affecting background color selection in text mode. The function of the fourth background attribute bit is selectable: by default it controls whether the foreground color blinks, so only the 8 dark background colors are available. It can be changed to control the background intensity bit instead, which allows selecting all 16 background colors, but then blinking text is not possible.
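
As a hedged illustration of that switch: on EGA/VGA the setting is flipped through the video BIOS (INT 10h, AX=1003h), while on a plain CGA you would clear bit 5 of the mode control register at port 3D8h instead. A DOS-era C sketch, assuming a compiler like Turbo C that provides int86():

#include <dos.h>

/* Select what attribute bit 7 means: 1 = blinking foreground (default),
   0 = bright backgrounds (all 16 background colors selectable). */
void set_blink(int enable)
{
    union REGS r;
    r.x.ax = 0x1003;          /* INT 10h: toggle blink/intensity bit */
    r.h.bl = enable ? 1 : 0;
    int86(0x10, &r, &r);
}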

How the color order is determined from the IRGB bits controlling the electron guns:

X = IRGB bits = color
0 = 0000 = Black
1 = 0001 = Blue (Dark)
2 = 0010 = Green (Dark)
3 = 0011 = Cyan (Dark)
4 = 0100 = Red (Dark)
5 = 0101 = Magenta (Dark)
6 = 0110 = Brown (actually, Dark Yellow which is adjusted to Brown in monitor)
7 = 0111 = White (actually, Dark White, Gray, Bright Gray, Light Gray)
8 = 1000 = Gray (actually, Dark Gray, Bright Black, Intensity Black)
9 = 1001 = Blue (Bright)
A = 1010 = Green (Bright)
B = 1011 = Cyan (Bright)
C = 1100 = Red (Bright)
D = 1101 = Magenta (Bright, or Pink)
E = 1110 = Yellow (Bright)
F = 1111 = White (Bright)
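
A small C sketch of what the monitor's decode stage effectively does, using the commonly quoted 0xAA/0x55 signal levels (an assumption about the 5153's output scaled to 8 bits, but it matches the usual published palette):

#include <stdint.h>

/* Convert a 4-bit IRGB value to approximate 8-bit RGB levels. */
void rgbi_to_rgb(unsigned irgb, uint8_t *r, uint8_t *g, uint8_t *b)
{
    uint8_t i = (irgb >> 3) & 1;                        /* intensity bit */
    *r = (uint8_t)(((irgb >> 2) & 1) * 0xAA + i * 0x55);
    *g = (uint8_t)(((irgb >> 1) & 1) * 0xAA + i * 0x55);
    *b = (uint8_t)(( irgb       & 1) * 0xAA + i * 0x55);
    if (irgb == 6)       /* the brown exception: halve the green gun */
        *g = 0x55;
}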
– Justme

Assuming PC and CGA/EGA/VGA graphics (based on your example image)

As mentioned in the other answers, colors require memory, which was not cheap back then. More memory for rendered VRAM also means you need faster CPU processing and more memory bandwidth. So it all boils down to finding a compromise between:

  1. screen resolution
  2. color depth
  3. frame rate

based on biological (what we can see and what is acceptable/enough) and technical aspects and limitations (costs, speed limits, reliability, adhering to standards and compatibility).

Your example shows a text mode which uses 16 bits per character: 8 bits for the extended-ASCII code of the displayed character, and 8 bits for two colors and a blinking flag (decoded in the sketch after this list):

  • 8 bits: ASCII character code
  • 1 bit: flash (blinking on/off)
  • 3 bits: paper (background, 8 colors)
  • 4 bits: ink (foreground, 16 colors, where the highest bit is brightness)
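
A hedged C sketch of pulling those fields back out of a 16-bit cell (the names flash/paper/ink follow the list above and are illustrative, not official; the layout assumes the little-endian character-then-attribute order used in PC text buffers):

#include <stdint.h>

struct cell_fields { uint8_t ch, ink, paper, flash; };

/* Decode one text-mode cell: low byte is the character, high byte the attribute. */
static struct cell_fields decode_cell(uint16_t cell)
{
    struct cell_fields f;
    f.ch    = (uint8_t)(cell & 0xFF);         /* 8 bits: character code   */
    f.ink   = (uint8_t)((cell >> 8)  & 0x0F); /* 4 bits: foreground color */
    f.paper = (uint8_t)((cell >> 12) & 0x07); /* 3 bits: background color */
    f.flash = (uint8_t)((cell >> 15) & 1);    /* 1 bit: blink flag        */
    return f;
}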

Now, in 4 bits we can encode 16 colors (the choice was made that 16 colors were "enough"). The graphics modes use 4 bits per pixel instead (also 16 colors). The standard 16-color palette was carefully chosen to be dithering-friendly, meaning you can dither without adding too much unnecessary noise, while still providing some basic colors for non-dithered use.

Similarly, once VGA introduced 256 colors, the standard 256-color VGA palette was also dithering-friendly, allowing an easier view of true-color images at 8bpp.

Of course, you can change the palette (on EGA/VGA) to any colors (that is how the plasma and some animation effects were done). The color DACs were 6 bits per channel, so you have 2^(6+6+6) = 262,144 colors at your disposal; however, you can only use 16 or 256 at once, depending on the video mode.

Text modes were not as heavy on memory size and bandwidth, as you need much less VRAM to store the whole screen (that is why they usually had bigger resolutions), so the real limitation came from the graphics modes. Let's assume 640×480 resolution, 4bpp (16 colors) and a 60Hz refresh rate, meaning you need:

  • VRAM: 640 × 480 × 4 / 8 = 153600 B = 150 KiB
  • Bandwidth: 60 × 640 × 480 × 4 / 8 = 9216000 Byte/s = ~8.79 MiB/s

which was already a lot to handle for old computers like the 8086, 80186 and 286, but doable, especially when skipping frames. Now assume 8bpp (256 colors):

  • VRAM: 640 × 480 × 8 / 8 = 307200 B = 300 KiB
  • Bandwidth: 60 × 640 × 480 × 8 / 8 = 18432000 B/s = ~17.58 MiB/s
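
The arithmetic above is easy to re-check with a few lines of C (just a calculator for these figures, not hardware code):

#include <stdio.h>

/* Bytes per frame and bytes per second for a packed-pixel mode. */
static void mode_cost(int w, int h, int bpp, int hz)
{
    long vram = (long)w * h * bpp / 8;
    double bw = (double)vram * hz;
    printf("%dx%d %dbpp @%dHz: %ld B (%.0f KiB), %.2f MiB/s\n",
           w, h, bpp, hz, vram, vram / 1024.0, bw / (1024.0 * 1024.0));
}

int main(void)
{
    mode_cost(640, 480, 4, 60);   /* 153600 B, ~8.79 MiB/s  */
    mode_cost(640, 480, 8, 60);   /* 307200 B, ~17.58 MiB/s */
    return 0;
}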

I do not think the 80286 could handle that fully, but the 80386 could. That is one of the reasons why the original VGA had just 320×200 at 8bpp, and bigger resolutions were still 1bpp or 4bpp until SVGA/VBE/VESA kicked in much later. Another reason was that higher resolutions would not fit into 256 KiB (640×480 at 8bpp is 300 KiB), which means even adding a few more kilobytes would have required more address lines and decoders...

Also, why not choose 5bpp or 6bpp instead of 4bpp?

Because 4bpp divides a byte neatly into 2 nybbles, and CPUs already have instructions for handling those, allowing easier programming and faster code for graphics processing.
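
For illustration, here is what that nybble handling looks like in C for a packed 4bpp buffer (an assumption made for simplicity; the real EGA/VGA 16-color modes actually store pixels in four bit planes rather than packed nybbles):

#include <stdint.h>

/* Two 4-bit pixels per byte: even x in the high nybble, odd x in the low. */
static uint8_t get_pixel4(const uint8_t *buf, int x)
{
    return (x & 1) ? (buf[x >> 1] & 0x0F) : (buf[x >> 1] >> 4);
}

static void set_pixel4(uint8_t *buf, int x, uint8_t color)
{
    uint8_t *p = &buf[x >> 1];
    if (x & 1) *p = (uint8_t)((*p & 0xF0) | (color & 0x0F));
    else       *p = (uint8_t)((*p & 0x0F) | (uint8_t)(color << 4));
}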

– Spektre
  • I wouldn’t say the CGA/EGA palette is very dithering-friendly; CGA barely had any graphics modes, especially ones where dithering would look anywhere near acceptable. The component values of the ‘brown’ colour are probably the biggest confounder here. The Windows 4-bit palette, on the other hand, was specifically designed for dithering. – user3840170 Jun 07 '21 at 08:53
  • The standard sixteen-color palette was "chosen" in such a way that many monitors could convert an RGBI color into analog RGB voltage using nine resistors, though some monitors would include an extra circuit to reduce the amount of green in the non-intense red+green color compared to the other non-intense colors that included green. – supercat Jun 07 '21 at 16:53
  • @supercat Any pointers how would digital RGBI be converted to analog RGB using 9 resistors? The IBM 5153 CGA monitor is a lot more complex than that. – Justme Jun 07 '21 at 20:31
  • @Justme: If full-scale white is supposed to be about 0.7 volts, then wire the analog red input to the digital red via 1K resistor, digital intensity via 2K resistor, and ground via 100 ohm resistor. Do likewise for analog green and analog blue, but substituting digital green and digital blue (share digital intensity for all three). – supercat Jun 07 '21 at 20:43
  • @supercat I do see your point at a block-diagram level, but in practice, directly driving a resistor DAC over a cable with LS TTL signals would be problematic. For example, the IBM 5153 CGA monitor first uses a digital buffer for the RGBI inputs, and from that point on, the buffered digital signals are used to control high-speed analogue transistor circuitry to reach the required bandwidth. – Justme Jun 07 '21 at 21:42
  • @Justme: Stick a quad or hex buffer chip on the input stage if you like; then it's one commonplace chip along with nine resistors. – supercat Jun 07 '21 at 22:23
  • @Justme: BTW, I wonder how the circuitry the CGA used to generate medium-resolution graphics mode colors compared in price to a 74LS670 4x4 register file, which would have made it easy for code to configure all four colors separately (wire D0-D3, WA, and WB to the system data bus D0-D5), and ensure that /GW is only asserted while the data bus is valid, and that chip could replace the 6-bit register that sits at that address as well as the downstream circuitry). – supercat Jun 07 '21 at 22:40
  • Memory bandwidth is a key point that the other answers have missed: the CGA had around 50µs to fetch an entire 160 bytes to display each row of the screen, meaning it had only a little over 300ns to fetch each byte, which was quite close to the best you could achieve with reasonably-priced DRAM chips at the time it was designed. – occipita Jan 03 '23 at 16:53
  • (... although having just looked at a picture, it seems that the CGA used 4517 chips, which were faster than the 4116s most other similar cards used, so maybe memory bandwidth wasn't that much of an issue). – occipita Jan 04 '23 at 13:32

In addition to the other reasons people have given, enabling those sixteen colors allowed the IBM PC (with an appropriate terminal emulator) to be compatible with the ECMA-48 standard of the late ’70s, also known as ANSI escape sequences or ANSI terminal codes.

This enabled a PC to connect to a server by modem or serial cable, and display screens written for a terminal such as the DEC VT series or IBM’s mainframe terminals. There was a device driver to enable ANSI terminal codes to work in MS-DOS, named ANSI.SYS. This was important to any PCs that connected to mainframes or time-sharing systems, especially those running UNIX or VMS, and Infocom’s text adventures were among the native software that used it.
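
The mapping is easy to demonstrate: ECMA-48 SGR codes 30-37 select the eight base colors, and SGR 1 ("bold", which ANSI.SYS renders as high intensity) yields the bright variants, reproducing the same 16-color set. A minimal C example:

#include <stdio.h>

/* Print the eight base colors and their bright (bold) counterparts. */
int main(void)
{
    int c;
    for (c = 0; c < 8; c++)
        printf("\x1b[%dmcolor %2d\x1b[0m   \x1b[1;%dmcolor %2d\x1b[0m\n",
               30 + c, c, 30 + c, c + 8);
    return 0;
}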

– Davislor

I'm going to disagree with the other answers here. The reason the PC only supported 16 colours at launch is that this was considered good enough at the time.

The IBM PC launched in 1981 with two choices of graphics adapter: the monochrome, text-only MDA and the color/pixel-graphics CGA.

The competition at the time was very limited. The PC was clearly intended to compete in the space occupied both by CP/M-based systems, which typically had only monochrome text displays, and by the Apple II and similar systems, which had only limited support for colour graphics and text (e.g. the Apple II supported 16-colour very-low-resolution graphics or text, and 6-colour 140x192 pixel graphics). CGA was the superior option.

Memory limitations did not prevent the CGA from being made better. Memory space wasn't an issue: the CGA was supplied with 16 KiB of high-speed onboard DRAM (in addition to the PC's base system memory), so it could easily have supported a 256-colour text mode.

Designing a board that supported this would have been more expensive. It would have required additional support circuitry and made the board larger, so it might not have fit into the PC case (which was designed to be a similar size to existing systems). These are reasons why, had the PC design team considered the option, they might have decided against it. But I don't imagine they ever once considered it -- 16 colours was enough to satisfy the market at that point in time, so that is what they designed.

– occipita
  • If you say it could have had 256 colors, that is true, but it might as well have used 24-bit colours, as for each character there is enough memory to select a 24-bit foreground and a 24-bit background color - but it didn't. If it did have 256 colors, then how would they be selected, and would the interface to the monitor be analog or digital? It would have made the NTSC composite output difficult as well. 4-bit RGBI with digital color data and a DAC in the monitor was indeed good enough. It only went to 6-bit colors in the EGA, and MCGA/VGA 256 colors with 18-bit DACs onboard to drive an analog monitor. – Justme Jan 04 '23 at 17:15
  • @Justme 24-bit foreground/background would require 7 bytes per character. This means that in the ~50µs available during the visible portion of each display line, it would have to fetch 560 bytes to display 80 columns, thus <100ns would be available per byte. This was not achievable with low-cost DRAM in 1981; to get this kind of performance, high-performance SRAM would have been necessary, putting up the cost of manufacturing the board by around $100. – occipita Jan 20 '23 at 18:07