
Early PCs generated an RF signal, and later composite video or S-video, to use a TV set as a monitor. Why didn't the color TVs of those days expose an analog RGB interface for direct connection from a VCR, PC, or any other local device? There must be some stage in the TV where the chroma/luminance signal has already been decoded into RGB and could be tapped into without costing too much (a sketch of that decode stage follows below).

A not-so-retro modding example:

Link:
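[Editorial illustration] To make the question's premise concrete: the "decoding" the question refers to is, at its core, a fixed linear transform from luminance and colour-difference signals to R, G, B. A minimal sketch in Python, using the standard PAL demodulation coefficients (signal scaling and gamma are glossed over; this illustrates the math, not any particular chassis's signal path):

    # The decode every color TV performs somewhere between the demodulator
    # and the picture tube: luminance Y plus the two color-difference
    # signals (U ~ B-Y, V ~ R-Y) back into R, G, B.
    def yuv_to_rgb(y, u, v):
        r = y + 1.140 * v
        g = y - 0.395 * u - 0.581 * v
        b = y + 2.032 * u
        return r, g, b

    print(yuv_to_rgb(1.0, 0.0, 0.0))  # pure white -> (1.0, 1.0, 1.0)

As one of the answers below notes, some sets never formed explicit R, G, B voltages anywhere in their circuitry at all, doing this matrixing inside the picture tube instead, so the stage to "hack into" did not always exist.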

Schezuk
  • For the same reason there were no widespread facilities for recharging electric cars in the 1960s :-) – dave Apr 02 '21 at 10:51
  • I think it's even worse than that. It would have never occurred to anybody in the '60s that someone would need to plug anything into a TV set other than an aerial. I'm going to guess that if you posed, as a speculative exercise, the question "what would you need to change and why on a TV set for future use other than aerial", the most likely answer would have been "direct control of the beam to convert it into a vector display, so you could run a personal RADAR and keep a lookout for ICBMs". – Euro Micelli Apr 02 '21 at 11:25
  • Computers and VCRs became home appliances in the '80s. It wasn't needed until then. – Thorbjørn Ravn Andersen Apr 02 '21 at 12:00
  • @Thorbjørn Ravn Andersen: Computers didn't really become common in homes until later than that, maybe mid-90s, by which time computer displays (even old CRT ones) had much better image quality than TVs (which were limited by broadcast signal quality). Not all that familiar with VCRs, but I expect they were designed to output TV signals. So the answer is that by the time there was a reason for TV RGB input, no one wanted it. – jamesqf Apr 02 '21 at 16:38
  • Why doesn't your current TV have chrono-sync inputs? – Kevin Apr 02 '21 at 17:31
  • @jamesqf Here in Scandinavia C64s were very, very common in the mid-eighties. – Thorbjørn Ravn Andersen Apr 02 '21 at 17:49
  • I bought a mid-high-end TV in the early 90s that had composite RCA and S-Video inputs, but that was only because of the availability of devices such as VCRs, camcorders, and home computers. If you look at the design of most consumer TVs, the electronics are as simple as possible to minimize manufacturing cost to keep them affordable to the most consumers. Every added bell and whistle pushes up price and restricts marketability, so won't be added without a definite market demand. – Anthony X Apr 02 '21 at 19:53
  • Not just early TVs. In Europe you had SCART thanks to the French, but elsewhere TVs have never had RGB inputs (except for VGA input on modern LCD TVs). Component inputs on modern TVs do not work with most retro computers with RGB output even after conversion to RGB, because they require an interlaced signal. – Bruce Abbott Apr 02 '21 at 21:00
  • @another-dave Well, there were, though they were limited to electric milk floats - and in the 1960s there would certainly have been at least one or two in every town, or more, based on the number of independent milk delivery companies. There was one down the road from my parents' house, for example. – Dai Apr 03 '21 at 00:57
  • @Dai - well played, but I claim a milk float is not a 'car'. – dave Apr 03 '21 at 01:22
  • @another-dave I'd argue it's more like a bus – Dai Apr 03 '21 at 02:04
  • @Thorbjørn Ravn Andersen: But Scandinavia is a pretty small market, so I'd be surprised to find that anyone was making Scandinavia-specific TVs. In the US, most people - even ones like me, who worked with them - didn't start getting computers until the IBM PC started being cloned. – jamesqf Apr 03 '21 at 02:33
  • @jamesqf This was to indicate when people started having appliances in the home intended for use with the TV. The '80s for us. Then came the need for better quality, which for us in Europe was the SCART plug with RGB input in the more expensive version. I later had a 32" Trinitron which had a fantastic image. – Thorbjørn Ravn Andersen Apr 03 '21 at 09:41
  • @Thorbjørn Ravn Andersen: I don't have any real experience with Scandinavian computers. (Though I did spend some time there in the '80s, it was mostly hiking & camping.) Or indeed European computers before this century, by which time I was working with an IBM BlueGene and had large flat-panel displays. But even in the '80s, a basic Hercules graphics card gave much better text quality than what I saw with TVs, and once VGA and Super VGA came along (starting 1987) there was simply no comparison. But IIRC even the first IBM PCs usually came with dedicated displays. – jamesqf Apr 03 '21 at 22:56
  • Why did horse carriages not have USB chargers? Why doesn't your router have a telegraph port? Why do race cars not have a phonograph player? Seriously, this question is trolling... – J... Apr 04 '21 at 11:11
  • @Euro, I had cable TV in the 1960s. – prl Apr 04 '21 at 22:28
  • @prl Cable TV in the 50s and 60s was the same thing - an RF modulated signal that came over coax cable - whether it was an antenna on the other end or the cable TV company made no difference. TVs still only accepted one type of input. – J... Apr 06 '21 at 11:49
  • @BruceAbbott In fact, some Eastern Bloc TVs made in the early 1990s were equipped with an RGB input. This was kind of "dictated" by quite a number of Soviet 8-bit home computers (both factory-made and DIY) producing exactly that: RGB video output. In addition, even if the TV didn't have an RGB input, it was quite probable that its color decoding module had a dedicated socket and some schematic provisions that made modding quite simple. – DmytroL Aug 22 '23 at 14:48

5 Answers


When colour television broadcasts began (1960s, in the UK; perhaps a little earlier in North America?) there weren't any local devices that customers might want to use. Broadcast TV was the only source of images that any home user could imagine.

Adding extra circuitry to handle separated R, G, B and sync inputs (with appropriate protections against overload etc.) wouldn't be straightforward, and certainly not cheap when receivers were generally constructed from discrete components (including thermionic valves). I'm guessing you've never disassembled a first-generation colour TV receiver?

As unit price was an important competitive element, no manufacturer would waste resources providing a feature that no customer wanted.

RGB SCART and the like were developed only when devices existed (already using the available inputs) and there was demand for a higher-quality picture that avoided the modulation process. And by that time, the target displays were built with transistors, increasingly using ICs rather than discrete components.

Toby Speight
  • I read years ago of a low-end consumer electronics company where (supposedly) the owner would look at a prototype new TV put together by the engineering department and say "what's this? do we really need it?" and start pulling parts out - as long as the TV still generated a reasonable (not perfect) image, the parts stayed out - every component removed = (lower price + higher market share) or higher profits. – manassehkatz-Moving 2 Codidact Apr 02 '21 at 14:59
  • @manassehkatz-Moving2Codidact "Madman Muntz" was the character in question; he made the first sub-$100 black-and-white TV set. – Dan Mills Apr 02 '21 at 16:29
  • Many older sets used a hot-chassis design. This made it necessary for them to use an RF transformer on the antenna input, but let them eliminate at least one more-expensive power transformer. Adding a headphone jack to a hot-chassis design required an audio output transformer, but that was often needed in any case. Adding a composite or RGB input would have been dangerous unless the set used a floating ground, which would have required an extra power transformer. – supercat Apr 02 '21 at 17:29
  • @supercat "hot-chassis" - as in the framework of the TV set itself carried live AC current? Wat?! – Dai Apr 03 '21 at 01:02
  • @Dai I assume not the actual mechanical metal chassis, but the internal ground reference etc. – SomeoneSomewhereSupportsMonica Apr 03 '21 at 02:25
  • @Dai sometimes, as the chassis was also the internal ground reference, one leg of rectified mains. e.g. https://www.badcaps.net/forum/showthread.php?t=10126 – Pete Kirkham Apr 03 '21 at 09:18
  • @Dai. Yes. And radios as well, of course. One wire of the AC input was connected to the chassis and, as in very many areas of the world, e.g. Europe, mains sockets and plugs weren't (and still aren't) polarized, so you never knew whether it was live or neutral. Back in those days, people didn't routinely open up the case unless they really knew what they were doing. :-) It was by no means the only setup but still, quite common. – Gábor Apr 03 '21 at 09:21
  • @Dai - always a hazard with audio equipment, or indeed anything without a polarized plug. Several musicians have died from their instruments being hot and the mic being a path to ground (or vice versa). – scruss Apr 03 '21 at 13:43
  • @Dai: Television sets needed to be constructed so that any parts that were connected to mains or high voltages could not be touched during anything resembling normal operation. On many sets, the power cord was molded into the back of the case, which would then plug into the chassis, such that unless one had a "cheater cord" one couldn't connect mains power to the unit with the back removed. If RF inputs and headphone outputs were isolated with both a capacitor and transformer, no mains voltages would be exposed outside the unit. A difficulty with providing composite or RGB video input... – supercat Apr 03 '21 at 16:57
  • ...is that a transformer capable of passing such a signal would need to be able to faithfully reproduce signals in the range 15kHz to 5MHz with a nearly flat frequency and phase response. Making a transformer work well over a frequency range spanning 2.5 orders of magnitude is a very tall order. By contrast, if a television set uses one transformer for VHF and one for UHF, all frequencies handled by each will be within about a factor of 2, and phase will only be relevant for frequencies within a few percent of each other. – supercat Apr 03 '21 at 17:03
  • @Dai: I suspect the cheapest way to add an RGB or composite video input that was isolated from a TV set's chassis would probably be to modulate the signal onto a higher frequency carrier, pass that through a transformer, and then demodulate the result. Since television sets already have demodulation circuitry built into them [see where I'm going with this...]? – supercat Apr 03 '21 at 17:05
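[Editorial note] A quick numeric check of the frequency-span arithmetic in the comments above (plain Python; band edges are approximate):

    import math

    # Baseband video: roughly 15 kHz (line rate) up to ~5 MHz.
    print(round(math.log10(5e6 / 15e3), 2))  # ~2.52 orders of magnitude

    # For comparison, a transformer serving a single tuner band has it
    # easy: e.g. UHF broadcast (~470-890 MHz) spans less than a factor of 2.
    print(round(890 / 470, 2))               # ~1.89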

Early colour TVs predated VCRs and home computers by many years. Even if it did not cost much, adding an RGB input would still be a cost for something that no one would use. However, it would have been more complex and expensive than you might expect today.

badjohn
  • Toby's answer covers this and more. – badjohn Apr 02 '21 at 12:24
  • More isn't always better. – snips-n-snails Apr 02 '21 at 18:43
  • Adding video baseband inputs to a hot-chassis television set would have been expensive, requiring either a more expensive floating-chassis design or else modulating the signal to allow it to be fed through a narrow-band transformer. The cost savings for hot-chassis designs have gone down with time, making them rare nowadays outside small electronic devices like those found in kitchen appliances, wireless remote light switches, etc. but in the 1970s they would have been quite significant. – supercat Apr 03 '21 at 20:40
  • @supercat Indeed. I should have, even if it did not cost much. I'll edit it later when I have a more capable device. – badjohn Apr 03 '21 at 21:12
  • @badjohn While Toby Speight's answer does more expansively cover the same issue, it was posted after this one. It's normally not considered a negative for this answer that a later one states basically the same thing. In the generic case, it's possible that a subsequent answer could cover similar points and in the process show that the earlier answer completely missed something (which could be a negative). That is, however, not the case here, IMO. – Makyen Apr 04 '21 at 22:14
  • @Makyen Indeed but I just wanted to be fair and acknowledge another good answer. – badjohn Apr 04 '21 at 22:27

Many TV designs up into the 1970s were so-called live-chassis designs, which used one leg of the mains input as a reference ground. This saved materials and weight: given that some early color TVs drew 200+ watts continuously, you would have needed a rather bulky and heavy mains transformer, and SMPS (switched-mode power supply) technology was not really mature for consumer devices at that time. Some sets used a small transformer to supply some low-voltage circuitry while running other parts of the unit straight off the mains - but the common ground, even of the transformer-supplied parts, was still directly connected to the mains.

An RGB input is a DC coupled input, unlike an RF input.

Most home electrical systems do not use polarized plugs, or the correct polarization of plugs and sockets cannot be relied upon sufficiently to use it as a safety feature.

A DC coupled input needs a DC coupled ground - which, in a live chassis design, has a 50% chance (with an unpolarized plug) of being at 120V/240V mains live potential...

Thus, a live chassis design CANNOT have any DC coupled inputs or outputs to random external devices*, unless complex isolation circuitry (which is not trivial for a wideband and DC coupled signal like RGB video) is used.

(There is hearsay that a significant amount of people got injured attempting to retrofit audio outputs, RGB or composite inputs etc. to live chassis TVs back in the day.)

*Actually, there were some live chassis RADIOS too, which sometimes had special connectors for turntables or microphones that were in themselves completely insulated.... This kind of design would be considered insane today.


And there is yet another reason. Some color TV designs did not, anywhere in their circuitry, decode the received signal into RGB at all, instead taking advantage of the multiple control inputs on a CRT (e.g. the grids for the low-bandwidth color-difference signals and the cathodes for the high-bandwidth luminance), so that the "final" picture was only composed within the CRT itself.
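[Editorial illustration] A sketch of the arithmetic such a "matrixing in the CRT" set avoids doing electronically (Python, standard luminance weights; in the tube, the grid-cathode voltage difference effectively performs the additions):

    # Transmit Y (luminance) and two color-difference signals; the third,
    # G-Y, follows from Y = 0.299 R + 0.587 G + 0.114 B:
    def g_minus_y(r_minus_y, b_minus_y):
        return -(0.299 / 0.587) * r_minus_y - (0.114 / 0.587) * b_minus_y

    r, g, b = 0.8, 0.4, 0.2
    y = 0.299 * r + 0.587 * g + 0.114 * b

    # Feed -Y to the cathodes and the difference signals to the grids;
    # the tube "adds" them: (R-Y) - (-Y) = R, and likewise for G and B.
    print(round(y + (r - y), 3))                  # 0.8  -> R recovered in the tube
    print(round(y + g_minus_y(r - y, b - y), 3))  # ~0.4 -> G never formed as a voltage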


Addendum: One might wonder how so much electronics could be powered "straight off the mains". Keep in mind that TVs until the late 70s often used at least some vacuum tube circuitry - and vacuum tube circuitry works great off a +150...250VDC bus, which you can relatively easily create from mains input without a transformer. And even in a fully semiconductor-based design, much of the power-hungry circuitry drives the CRT and the EHT inverter (usually combined with the horizontal output stage into one circuit) - this is typically not low-voltage circuitry either, which is the reason vacuum tube circuits were used for it for so long: high-voltage transistors or thyristors were LESS economical to use back then. (Fascinatingly, there were production color TV designs that used ICs and vacuum tubes together in one chassis.)
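[Editorial note] A back-of-the-envelope check on those rail voltages, assuming an ideal rectifier feeding a reservoir capacitor (which charges to near the mains peak; real sets lost some volts to droppers and ripple):

    from math import sqrt

    # Peak of a sine wave is RMS * sqrt(2); rectifier + reservoir
    # capacitor gets close to that peak -- no mains transformer needed.
    for rms in (120, 220, 240):
        print(f"{rms} V mains -> up to ~{rms * sqrt(2):.0f} V DC")
    # 120 V mains -> up to ~170 V DC
    # 220 V mains -> up to ~311 V DC
    # 240 V mains -> up to ~339 V DC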

rackandboneman
  • Could one have designed a set to drive the cathodes from phase-shifted chroma signals, and simply feed the received video to the grid, without bothering to demodulate chroma anywhere except in the tube? Picture quality would probably not be very good, but the amount of circuitry for that set could probably be slashed to half what would be required to properly demodulate a color signal into Y, U, and V components. – supercat Apr 05 '21 at 16:47
  • "so called live chassis designs" - any good descriptions of this? I have heard of this but still don't totally understand it. – Maury Markowitz Apr 06 '21 at 14:36
  • @supercat I think that is more or less what some designs did :) – rackandboneman Apr 07 '21 at 03:45
  • @MauryMarkowitz In non-technical terms: The whole damn thing, everything in it, is directly connected to wall power. Touch anything metallic that is part of the circuit - including a connector - and get bit. – rackandboneman Apr 07 '21 at 03:47
  • Any exposed electrical connectors needed to be isolated from the rest of the circuit. Since tube-based television sets and radios generally used transformers for impedance matching of their speaker outputs, having those transformers also provide isolation was trivial--much cheaper than electrically isolating the supply. – supercat Apr 07 '21 at 05:13
  • @supercat for audio, yes. An isolation amplifier for RGB video is in a very different league. One way to do it would be to use a carrier wave ... oh wait, that is what using an RF modulator essentially does :) – rackandboneman Apr 07 '21 at 18:43
  • @rackandboneman: My point was that on a hot-chassis design, connectors that can be touched will be isolated, and connectors that aren't isolated will be constructed so they can't be touched. – supercat Apr 07 '21 at 18:52

TV manufacturers didn't have a single, obvious RGB connection standard to implement. Physically, there was SCART (with competing European and Japanese pinouts), RCA, DE-9, and various manufacturer-specific DIN plugs to choose from.

Then you have the various electrical signals to send over them, such as RGBS, RGsB, RGBHV, YPbPr, digital RGBI, etc.

And VCRs didn't even need RGB because Y/C was good enough (in fact, I think Y/C is good enough for early computers, also), and if you're recording off the air, the quality wasn't that great to begin with. (Prerecorded videocassettes didn't become affordable until later.)

While this was all being sorted out, device manufacturers typically provided an RF modulator that could work with any TV through the antenna connector that every television already had (for example, 300-ohm twin-lead screw terminals or a coax connector of some sort). So for the TV manufacturers, the job was already done. Providing more connectors for external devices quickly hits diminishing returns.

snips-n-snails
  • Screw connectors? I only remember push connectors. I recently disconnected my mother's TV; the aerial was just a push connector. – badjohn Apr 02 '21 at 21:36
  • @badjohn Push connectors are cheaper, but prone to unintended disconnection or shifting. Real installers use connectors with screw-on collars :) – Armand Apr 02 '21 at 21:46
  • Indeed, screw connectors are better but I've not seen them on old analogue TVs – badjohn Apr 02 '21 at 21:50
  • By "push connectors," do you mean F connectors? Because there are some F connectors that you push on and others that you screw on. – snips-n-snails Apr 03 '21 at 00:10
  • @badjohn When snips-n-snails talks about 300 ohm screw connectors, they're referring to twin-lead. It was used for the UHF side of things on the faux woodgrain TV my parents got used when I was a kid in the early 90s. – ssokolow Apr 03 '21 at 01:19
  • @ssokolow VHF also. This modulator uses twin lead and lets you choose channel 3 or 4 (around 60-70 MHz, well below the 300 MHz cutoff for UHF): http://nerdlypleasures.blogspot.com/2014/11/the-purity-of-rf-output.html – snips-n-snails Apr 03 '21 at 04:19
  • @ssokolow Sorry, I missed the 300 Ohm. I guess that it is a location thing, Here in the UK, I have only seen analogue TV aerials connected with things like this: https://www.youtube.com/watch?v=cWEUuV5Qfxc. We switched from VHF to UHF a long time ago but we used the same cables and connectors for both. My parents once had a dual standard VHF / UHF TV. Both aerial sockets were identical. – badjohn Apr 03 '21 at 08:33
  • @snips-n-snails I mean mostly cheap and nasty things. The YouTube video I just posted in my previous comment resembles most analogue TV aerials that I have seen. As I said there, I am in the UK. Where are you? I have never had cause to look at the back of a US analogue TV. The good old days, when things varied so much around the world. – badjohn Apr 03 '21 at 08:36
  • @badjohn Here in North America, we started on the 300 ohm twin-lead stuff and migrated incrementally to 75 ohm coax with F connectors. If my parents had bought a TV from the "black plastic housing" era or newer, it'd have only had an F connector. We still use F connectors for TV cable here and I remember reading somewhere that, at the frequencies we use, F connectors offer superior noise rejection to those PAL-region ones you're familiar with... judging by the Wikipedia F connector page, it may be the screw-on vs. push-on thing. – ssokolow Apr 03 '21 at 12:08
  • @badjohn I suspect the twin-lead to F connector switch-over being NTSC-specific is similar to how PAL is superior to NTSC in various ways because it came later, specifically learning from NTSC's shortcomings with Europe waiting to design and adopt something that dealt better with broadcasting in mountainous environments. – ssokolow Apr 03 '21 at 12:16
  • @ssokolow For cable TV, we use the same type of connector. However, for TV over the air, the aerials still use the flimsy push connectors. I just looked at the back of a TV that is less than a year old and it is still the same as I knew 50 years ago. One difference is that I have never used it; 50 years ago, you had no choice: use it or get no picture. I don't even have a TV aerial any longer. – badjohn Apr 03 '21 at 14:05
  • @badjohn In the USA, I've never seen the kind of coax push-on connector shown in the video link you posted. What did the UK use before that? Like in 1970? – snips-n-snails Apr 03 '21 at 15:49
  • @snips-n-snails As far back as I can remember, so the mid-1960s, TV aerials have used the same type of connectors. The quality has varied, from ones that fall apart if you breathe on them or won't make a decent connection, to pretty solid ones with good connections. However, the format has remained the same. I could take an early 1960s TV and connect it to a recently installed aerial feed (I wouldn't get a picture since there are no analogue signals, especially not in the VHF band). Conversely, I could connect a modern TV to an aerial installed in the 60s and I might get a picture. – badjohn Apr 03 '21 at 16:51
  • @snips-n-snails I have seen 300 Ohm twin lead but only on fairly up market FM radio receivers. Even then, there is usually a coaxial option similar to that for TVs. You would normally get a T shaped antenna with the receiver. Here's an example, it even comes with a 75/300 Ohm converter in case you don't have the 300 Ohm balanced connection on your receiver. https://www.amazon.co.uk/Bingfu-formation-Transformer-Enhances-Reception/dp/B07MDZ3VK6 – badjohn Apr 03 '21 at 16:56
  • DE-9 and HD-15 are rather PC-specific actually, and only became relevant in the mid-eighties (the earliest DE-9 based connections - CGA/EGA - were digital RGB actually) - and even then, almost nobody who had the money to invest in a PC system (which was expensive professional-grade stuff in the first half of the eighties ... and when clones came to the enthusiast scene in the second half of the 80s, so did relatively affordable monitors) would have wanted to connect it to a TV set. And if someone wanted to do that, CGA cards often had composite outs that could drive RF modulators. – rackandboneman Apr 03 '21 at 18:14
  • @ssokolow - here in the UK we used to say that NTSC stood for 'Never Twice the Same Colour'. – Michael Harvey Apr 05 '21 at 13:17
  • "Fascinatingly, there were production color TV designs that use ICs and vacuum tubes together in one chassis." - I used to work on them, in particular, the British 1971 design Decca Bradford which was 'hybrid' (transistors and vacuum tubes). It had one IC, a Motorola MC1351P which contained the audio IF strip and driver stage. Later models of this chassis had PAL decoder with an IC. Other brands used ICs from 1968. In America, RCA first used an IC in a TV chassis in 1966. – Michael Harvey Apr 05 '21 at 13:39

The question mentions “early PCs” that generated a TV-compatible RF signal and “the color TVs of those days”. This would be a period spanning from mid-to-late 1970s to mid-1980s. The computer systems in question would be microcomputers targeted at the home market.

By the late 1970s, new TV sets were already transistor- and IC-based designs instead of tube-based designs with a hot chassis. Baseband CVBS and audio inputs (bypassing the RF tuner) had started appearing on select models. These were originally meant for connecting a home VCR — a novel thing which was just beginning to be affordable and commonplace. But by the early 1980s, early home computers, home video cameras, and video game consoles were also starting to use baseband inputs and drive up demand for them.

In the European market, the go-to baseband AV connector was, at first, often some variant of the round, multi-pin DIN connector, such as the one on this 1978 Grundig Super Color 8642. But this was a relatively short period. Due to an alleged French attempt at protectionism, European TV sets soon started standardizing on the larger, rectangular, multi-pin SCART connector, invented by the French.

Since the SCART connector specified, in addition to CVBS, RGB inputs (with an overlay capability, no less!), by the mid-to-late 1980s many European TV sets sporting a SCART connector effectively doubled as 15kHz RGB monitors.
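[Editorial reference] The handful of SCART pins that make this work, as a sketch from memory; verify the exact pin numbers against the actual standard before wiring anything:

    # The SCART pins that turn a late-80s European TV into an RGB monitor.
    # Pin 16 is the "overlay capability" mentioned above: it switches the
    # set between the RGB pins and composite, fast enough to do so mid-line.
    SCART_RGB_PINS = {
        7:  "Blue in",
        11: "Green in",
        15: "Red in",
        16: "Blanking / fast RGB switching (asserted = show RGB)",
        20: "Composite video in (supplies sync when using RGB)",
        8:  "Slow switching (AV select / aspect ratio)",
    }
    for pin, role in sorted(SCART_RGB_PINS.items()):
        print(f"pin {pin:2d}: {role}")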

There were exceptions, of course. The cheaper, portable TVs often only connected the CVBS and audio pins to their SCART connectors, leaving the RGB pins inoperative. And even many larger TVs — often equipped with multiple SCART inputs — commonly only had the RGB capability implemented on their primary SCART connector.

In the 1980s, there were also a lot of older, vacuum-tube-based sets from the previous decades still in use with no connectors for external devices, except for the RF signal input.

For such reasons, every manufacturer aiming to reach homes and interface with the installed base had to provide an RF modulator, at least as the lowest-common-denominator option, and design their system around TV-compatible 15kHz timings.

RGB-capable SCART connectors also found their way onto actual (15kHz, "CGA-level") computer monitors. Popular European examples include the Philips CM8833, the Commodore 1081, and the Commodore 1084, all of which could be used both as an RGB computer display and as a dedicated (baseband) video monitor for purposes such as video editing or monitoring a CCTV system. (Pro video people would use higher-quality video monitors with more broadcast-oriented features, such as Sonys or Ikegamis, but these entry-level monitors were good enough for security and prosumer/videographer purposes.)

One of the things that might have contributed to making RGB inputs a "natural thing" in Europe was the popularity of the Teletext system. By the end of the 1980s, a Teletext decoder (which includes a built-in RGB character generator that can sync to an external video source and superimpose the generated text/graphics on the live video) had become a standard feature on European sets. Supporting such a chip in the design is only a small step away from providing external RGB inputs. Then again, American TV sets had built-in closed-caption decoders which employed similar CG technology — and around this time, TVs also started getting crude on-screen menus which (I believe) often initially used the Teletext or CC character generator chip for their video output.
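[Editorial illustration] A toy model of what that superimposition logic does, pixel by pixel (Python; `fast_blank` stands in for the RGB-switching signal, and all the names here are illustrative, not from any datasheet):

    # Teletext/CC overlay: wherever the character generator asserts its
    # blanking output, the set shows the locally generated RGB instead
    # of the incoming broadcast picture.
    def composite_scanline(video, overlay, fast_blank):
        return [o if blank else v
                for v, o, blank in zip(video, overlay, fast_blank)]

    video = ["pic"] * 8
    text  = ["txt"] * 8
    blank = [0, 0, 1, 1, 1, 1, 0, 0]  # character cells in the middle
    print(composite_scanline(video, text, blank))
    # ['pic', 'pic', 'txt', 'txt', 'txt', 'txt', 'pic', 'pic']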

Be that as it may, due to NIH syndrome and other market-related factors (SCART RGB was basically forced on the European manufacturers by the French, but North America had no similar regulation or market pressure), American SD/CRT TV sets never got RGB inputs as a standard feature.

However, even non-European manufacturers were eventually forced to add something functionally equivalent to RGB to their TVs, in the form of component (Y′Pb′Pr) inputs. This was due to the introduction of the DVD standard: DVD players required a better signal type than composite video (CVBS) or S-video (Y/C) to make their improvement in image quality discernible.

European DVD players, of course, did not use component (Y′Pb′Pr) signals but had a SCART RGB connector on the back, for the best compatibility with European TV sets. (Or rather, manufacturers usually supported both RGB and Y′Pb′Pr signals through the same pins so they could ship the same PCB and case with a different back panel to different markets, and you could choose the output mode in the configuration menu.) Similarly, European (fourth- and fifth-generation) game consoles often came with an RGB SCART cable or had one available as an option, whereas the American versions would offer a component video cable in its place.

In conclusion, TV manufacturers added RGB signal inputs (or Y′Pb′Pr signal inputs, which are just another way of dividing the signal into three components and offer comparable quality) when market demand or local regulations so required — not any sooner, and not any later. Europeans got a head start because the French made it a legal requirement (which was a good thing from the perspective of a home computer hobbyist), but free-market-driven development in other parts of the world only saw RGB-level signal quality become a necessity on a domestic TV after the introduction of the DVD standard.
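[Editorial illustration] The "just another way of dividing the signal into three components" point can be made precise: Y′Pb′Pr relates to R′G′B′ through an invertible 3x3 matrix, so nothing is lost in either direction. A quick sketch using the Rec. 601 coefficients that apply to SD video:

    import numpy as np

    # Rec. 601: Y' is the usual luminance weighting; Pb and Pr are
    # scaled versions of B'-Y' and R'-Y' respectively.
    M = np.array([
        [ 0.299,     0.587,     0.114   ],  # Y'
        [-0.168736, -0.331264,  0.5     ],  # Pb
        [ 0.5,      -0.418688, -0.081312],  # Pr
    ])

    rgb = np.array([0.8, 0.4, 0.2])
    ypbpr = M @ rgb
    print(np.linalg.inv(M) @ ypbpr)  # -> [0.8 0.4 0.2]: fully recoverable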

Jukka Aho
  • SCART was introduced in France in 1980, not mid to late 80s. All TV sets sold in France were required to feature a SCART connector. SCART enabled the introduction of Pay-TV with the creation of Canal+ in 1984, which required an external decoder to watch its programs (including p0rn). – Patrick Schlüter Apr 06 '21 at 09:08
  • By "mid-to-late 1980s" and "European TV sets", I referred to new TV sets produced for the (Western bloc) European local markets in general, not only France. While the manufacturers eventually went "the French way", SCART was never a legal requirement in other European countries. – Jukka Aho Apr 11 '21 at 12:17