33

As far as I understand it, the personal computing revolution that Microsoft Windows brought about was not entirely of its own design. Is it true that the Windows OS at its core was originally designed simply to be the OS of the terminals of the Windows server architecture? Similar to how Fedora is at its core intended to be the OS of the Red Hat Enterprise Linux servers?

Neil Meyer
  • 6,275
  • 9
  • 29
  • 44
  • Comments are not for extended discussion; this conversation has been moved to chat. – Chenmunka May 06 '20 at 15:12
  • You removed the anecdote about Windows Update from your question, but if you post a separate question about that you might get some interesting answers...because the anecdote is almost completely backwards. – user3067860 May 06 '20 at 17:20
  • 2
    It needs to be remembered that "DOS" was originally "QDOS" -- "Quick and Dirty Operating System" marketed by Seattle Computer Products, then later sold to Microsoft. And IBM had a windowing UI (can't recall if it was OS/2 or DOS based) before Windows became a thing, but IBM never pushed it. – Hot Licks May 07 '20 at 00:48
  • Game setup. Please select your sound card: Soundblaster, Gravis, AWE 32. It was that kind of pain and the fact that MS were in the right place at the right time. – Frank May 07 '20 at 06:45
  • @HotLicks: I'm somewhat confused – you're not referring to IBM's 1988 Presentation Manager for OS/2, are you? Because I always thought that this was IBM's first GUI for PCs. But Presentation Manager is basically a contemporary of Windows 2.0. – Schmuddi May 07 '20 at 10:17
  • This should be moved to SE.Religion and re-titled "How did demons come to operate purgatory" – johnDanger May 07 '20 at 15:56

8 Answers

82

The short version is that Windows became the de facto operating system thanks to Microsoft’s business acumen (or shenanigans, depending on your point of view), marketing, skilled developers, a strong focus on backwards-compatibility, and the success of MS-DOS.

The success of Windows in general can be traced back to the success of Windows 3.0, which has already been addressed here. I’ll try to explain how we ended up with a “server-oriented” operating system, Windows NT, as the de facto operating system on home PCs nowadays.

Early versions of Windows (in the first half of the eighties) were developed in the context of a perceived “rise of the GUIs” (see also the Apple Lisa and Macintosh). They weren’t all that successful, until some skilled developers implemented new features in Windows which took full advantage of newer CPUs (286 and up), leading to the surprise success of Windows 3.0. The latter gave rise to a large ecosystem of software, which helped cement the Windows advantage: users started using Windows not only as a graphical shell, or a GUI for a small number of applications, but for most if not all of their computing needs. This led PC manufacturers to include Windows by default with their PCs (where previously many would only ship DOS), helped later on by Microsoft’s contracts which strongly encouraged bundling. PCs changed from “DOS PCs” to “Windows PCs” by default.

In parallel, Microsoft had been working on a replacement operating system for DOS for a long time; not Xenix, but OS/2, and after the falling-out with IBM, NT. This ended up targeting high-end PCs of the time, taking full advantage of 32-bit CPUs, with a high-level architecture resulting in a more portable and maintainable, but also more resource-intensive, operating system. As a result, it was initially marketed for servers and workstations, effectively as a competitor to Unix (and mini-computer operating systems).

The success of Windows 3.0 started the ball rolling on the unification process (which would take a long time): Windows NT switched from an OS/2-based API to a new 32-bit API, Win32, based on the existing 16-bit Windows API (this happened before the first release of NT, 3.1). But NT was still too big for home (and office) computers, and was really bad at one thing that mattered to home users: playing games. The UI was very similar across all three operating system lines at the time — compare OS/2 1.3, Windows 3.0, Windows 3.1, and Windows NT 3.1.
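
As an illustrative aside (my sketch, not part of the original answer): Win32 was deliberately close enough to the 16-bit Windows API that a trivial program could be built for either from the same C source, which is why basing Win32 on the existing API made the eventual unification plausible. Assuming Windows 3.1-era headers and compilers, something like this compiles as both a Win16 and a Win32 executable:

    #include <windows.h>

    /* A complete Windows program. The same source builds as a 16-bit
       (Windows 3.x) or a 32-bit (Win32/NT) executable; porting simple
       applications was often a recompile rather than a rewrite. */
    int PASCAL WinMain(HINSTANCE hInstance, HINSTANCE hPrevInstance,
                       LPSTR lpszCmdLine, int nCmdShow)
    {
        MessageBox(NULL, "The same source, Win16 or Win32.",
                   "Win32 sketch", MB_OK);
        return 0;
    }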

The parallel development streams continued: Windows 3.0 was extended with multimedia features (MPC and then Windows 3.1) and, importantly for office use, networking features (Windows for Workgroups). Windows also started being bundled with computers, ultimately in combined MS-DOS 6.22 / Windows for Workgroups 3.11 packages.

At this point non-NT Windows still hosted 16-bit applications, which was increasingly limiting (and bad for publicity, for the relatively small number of people who understood such matters, since pundits had definitely moved on to a 32-bit future). It’s hard to imagine nowadays, but at the time, many in the computer press wondered what the future would bring: Windows 3 was stuck with its “16-bit GUI on top of DOS” roots, NT wasn’t really there yet, OS/2 was mis-marketed, and Unix could never be the next new thing because it was already old at the time (and more importantly, heavily fragmented). Even the hardware side was up in the air until Intel released the Pentium: many thought the next big thing would come from the RISC world.

In Windows, there were a couple of stop-gap measures (Win32s and WinG), but the full switch to 32-bit applications really started with the release of Windows 95. The latter was an enormous event, not only in the computing world, but in the (Western) world at large: Microsoft had been promoting “Chicago” (the code name for what was supposed to be Windows 4.0 initially, then 95) for years, and the release event was coordinated around the world with massive media presence (including, in the UK, a specific edition of The Times), and a whipped-up frenzy which resulted in long queues in front of computer stores...

By this point the home market was a done deal. Windows 95 and 98 were the operating system for home and office PCs, but they were still a support nightmare (and Windows Me even more so). NT was getting better all the time, and Microsoft pushed to unify the two streams in Windows 2000 and then XP — not by changing NT all that much, but by adding good support for games (thanks to DirectX and a revamped graphics driver architecture) and for older software (with Sound Blaster emulation for old DOS games in particular). Windows 2000 was a great gaming platform at the time, at least with Windows games (Counter-Strike, Unreal Tournament, Age of Empires II...).

There were of course a number of questionable practices in all this, the two major ones being pre-announcements (pre-announcing MS-DOS 5 and 6 helped counter DR DOS, and pre-announcing Windows 4 helped counter OS/2), and bundling agreements with PC manufacturers (who could include DOS and Windows at reduced cost, but only if they paid for the bundle on all the PCs they sold).

Is it true that the Windows OS, at its core, was originally designed to simply be the OS of the terminals of the Windows server architecture? Similar to how Fedora is at its core intended to be the OS of the RHEL servers?

Both of these statements are incorrect, but it would take another answer to address them...

Stephen Kitt
  • 121,835
  • 17
  • 505
  • 462
  • 6
    I would add two nitpicky details:

First, Microsoft made printer compatibility a priority for Win 3.1, which helped seal the deal for third-party vendors thinking of porting their office/productivity packages from DOS.

    Second, the dream (by some) that the Mac would overtake Windows was decimated with the release of Win 95. Of all the major alternatives to Windows, the Mac was the only one you could truly say targeted consumers, and Apple failed to capitalize on it until Jobs returned.

    But this is small stuff compared to the broader picture. Great answer.

    – Jim Nelson May 05 '20 at 18:04
  • 2
I think an important point in the historical evolution of the NT kernel becoming the standard home desktop kernel was the catastrophically bad Windows ME that drove Microsoft to push out a replacement ahead of what had originally been planned. If ME had been a decent upgrade to 98SE, then XP probably would not have happened (at least, not when it did). – asgallant May 05 '20 at 18:36
  • 13
    Maybe some mention of Windows 2000, which introduced many of the 9x features into NT for the first time (like FAT32, WDM, and DirectX 7). XP was the spit-and-polish release, but 2000 is really where the streams merged, and some of those "in the know" were running Win2k at home in 2000-2002. – hobbs May 05 '20 at 21:20
  • 6
@hobbs I remember preferring 2000 over XP for quite some time; if my memories are correct it was more stable and less of a resource hog. And I considered XP's "friendly" GUI to be ugly :] If I recall correctly, it took some service packs to make XP usable for me. It pretty much began the scheme of server and consumer Windows versions built on the same kernel. – PTwr May 05 '20 at 23:00
Why we ended up with a “server-oriented” operating system: if it ain't broke don't fix it (anyone who would know how to, makes seven figures anyway). NT, released in 1993, "was produced for workstations and server computers". The kernel is still in use today: it's called 10. - If I had to airgap a PC for something important, it'd be running 2000. My personal best uptime was eight months before an intentional restart. – Mazura May 06 '20 at 01:13
@hobbs - "2000 is really where the streams merged", and then they never diverged. Companies with billions of dollars on the line are what drove uptime capability; they needed workstations and servers that worked. It's a lot easier to take stuff out of something that works than to reinvent the wheel. 2000 was awesome because it really didn't have much to slim down to build a gaming rig, and it did all the things I expect a PC to do. – Mazura May 06 '20 at 01:33
  • 2
@hobbs: So true!! Why does everybody always forget to mention W2k? This was the best OS of that time IMHO, much better than XP. XP compared to W2k seemed to me like W8 to W7: design nonsense became too important. Maybe W2k is ignored so often because there was a Windows ME, which was the direct follow-up to W98. WinME from my experience on a friend's computer was a flop: frequent crashes, hardware incompatibilities. But WinME was still targeting home users, while W2k was rather targeting business customers, so people may see it as the WinNT follow-up and not yet as the merge that it was. – Tobias Knauss May 06 '20 at 05:39
  • 1
    @hobbs thanks for the reminder, 2k was indeed a great version of Windows! – Stephen Kitt May 06 '20 at 07:20
  • Also I believe some popular version of Windows was deliberately coded not to work correctly under DR-DOS. – Thorbjørn Ravn Andersen May 06 '20 at 07:55
  • 3
    @Thorbjørn a beta version of Windows 3.1 printed a warning on DR DOS, that might be what you’re thinking of; that led to lots of FUD afterwards, but Windows itself worked fine. There was a “business update” to DR DOS 6.0 which circumvented the check, but it wasn’t needed for the retail version of Windows 3.1. – Stephen Kitt May 06 '20 at 08:06
  • 1
    @StephenKitt I was thinking along https://www.theregister.co.uk/1999/11/05/how_ms_played_the_incompatibility/ I did not encounter the problem myself – Thorbjørn Ravn Andersen May 06 '20 at 08:14
  • @Thorbjørn right, that refers to the AARD incident and a couple of other messages; but as the article suggests, there were no real incompatibilities, just strategies by various people at Microsoft to make users believe that Windows wouldn’t work properly on DR DOS. (My anecdotal evidence is that I ran Windows 3.1 on DR DOS 6.0 for years without any problem.) – Stephen Kitt May 06 '20 at 08:23
  • 2
    @Mazura Windows 2000 was a magnificent creature, wasn't it? I don't think I got as far as eight months, but I know I regularly hit four or five before I had some reason to power down. – T.J.L. May 06 '20 at 15:10
@JimNelson - "Mac would overtake Windows" - Apple raised prices across the board in late 1989. Much cheaper and faster EISA 386 and some 486-based clones were already being sold by then, and Windows 3.0 was released in May 1990. The clones took over sales; during the 1990s, the top 20 PC makers only accounted for about 50% of sales, the rest sold by mom-and-pop shops using off-the-shelf components. Mac share of sales went from 25% down to 5% during this transition. – rcgldr May 10 '20 at 21:38
I have an M-Audio Delta 1010LT, a PCI card that turns your PC into a DAW. It is (was?) the cheapest and most capable card ever available at ~$200, a price point that never changed throughout its entire decade-long service life. If I want to run that card I have to build a W2k machine because there are no drivers for it past that. It would coincidentally be the most stable running computer in 2020, and it'd be good for recording 8 tracks of CD-quality audio for hours at a time. I would record entire sets with the monitor off; the only time it would lock up was due to lack of HDD space. – Mazura May 24 '20 at 23:57
35

The other answers include a lot of sound historical information about how Windows evolved into its dominant role on PCs in both the home and business environment. But I think the most fundamental, simplest, "Occam's razor" answer is that consumers never had to make a choice. It was PC manufacturers that chose Windows as the default OS, not users and consumers.

Aside from the one-time media event that accompanied the release of Windows 95, home PC buyers have been accustomed to simply buying PCs with the assumption their purchase would include software. After all, a PC without software (the OS, in this discussion) is not serviceable for any task. In the mind of a consumer, the PC and its OS are inseparable parts of the product. It would be like buying the chassis of your car separately from an engine - not likely - and few buyers even bother to "look under the hood".

Based on this analogy, it is easy to understand that the behind-the-scenes dealings between Microsoft and PC manufacturers dictated this outcome much more than any technical merits or drawbacks associated with Windows - and its long history provides many examples of both. I'm not claiming that Windows either succeeded because of its merits or despite them. Simply that such concerns pale in comparison to the fact that almost all PC purchases included Windows as a default. Don't bother consumers with deciding what software is included. As long as it is sufficiently serviceable and compatible with their applications, the buyers simply don't care.

Much has been written on Microsoft's business approach to bundling their software with consumer hardware. It really began with Microsoft Basic, continued with MS-DOS, and reached its peak with Windows.

Brian H
  • 60,767
  • 20
  • 200
  • 362
  • 14
It would be remiss to discuss this facet of the spread of MS Windows and operating systems without mentioning the monopolistic behaviors of Microsoft (for which it was sanctioned) that curtailed the potential growth of competitive environments. https://en.wikipedia.org/wiki/United_States_v._Microsoft_Corp. – Will Hartung May 05 '20 at 13:37
  • 5
    @WillHartung I think you are right. But even Microsoft's behavior vs. competitors is a small contribution compared to their superior understanding of consumer behavior. – Brian H May 05 '20 at 13:48
  • 9
    This answer is also missing the critical fact that in order to get OEM licensing for Windows, OEMs had to pay for an OEM license for every PC shipped, regardless of whether Windows was included, which slaughtered any competitive market in a landscape when every other component could be customized. – chrylis -cautiouslyoptimistic- May 05 '20 at 18:14
  • 1
    @chrylis-onstrike- It's a good point that goes to the heart of some competitors' legal complaints against MS bundling. Those complaints did not resonate with typical PC buyers. We techies may have loved a "Choose your OS" option at purchase, but typical users would just be confused by it... – Brian H May 05 '20 at 18:48
  • 2
    If the "typical users" weren't confused by specific options for RAM and hard drives, or for that matter several different versions of Windows, a "no Windows" option wouldn't have been any problem. For that matter, the typical user is probably substantially less aware of such matters today, and "no OS" options are available from most manufacturers that make configurable PCs. – chrylis -cautiouslyoptimistic- May 05 '20 at 19:04
  • @chrylis-onstrike- "The one with higher numbers is more powerful" was a pretty common heuristic. Even AMD vs Intel is mostly a branding thing. But Windows vs Linux is pretty much "my computer is completely broken" territory. The vast majority of PCs sold are not configurable even today (and computer users are even less knowledgeable than they were in the past). – Luaan May 06 '20 at 11:21
  • @Luaan Exactly my point: In the past, users were on the whole much more knowledgeable, and a "no OS" option is rather unambiguous. – chrylis -cautiouslyoptimistic- May 06 '20 at 11:24
  • 1
@BrianH this is not entirely true. Win95 was the first that was commonly bundled with the machine, and even that not always. MS-DOS was a standard and it was even difficult to buy a machine without it, but before Win95, Windows was either an option or you had to buy it separately, and even with Win95 it was easy to buy a PC with just DOS (I did just that for a BBS system I ran). Around Win98 it became almost impossible to get a computer without Windows pre-installed. – Tom May 06 '20 at 12:35
@Tom My own recollection of Windows 3 is slightly different from yours. It was easy to buy a PC with Windows 3.1 pre-installed (along with DOS, obviously) and it was usually included in the price of the new PC. This made it a "no-brainer" for almost all PC buyers, even if some opted for DOS only. – Brian H May 06 '20 at 14:23
  • @BrianH these statements are not mutually exclusive. Yes, you could buy PCs both with and without Windows at that time. – Tom May 07 '20 at 04:04
  • RE: "PC manufacturers that chose Windows as the default OS, not users and consumers." -- I disagree. Users and consumers buy a computer and OS for the applications that can be run. Thus it was the multitude of software vendors selling windows applications that sealed Windows becoming the default OS. – MaxW May 08 '20 at 04:03
15

How exactly did Windows become the OS of the home PC?

Is it true that the Windows OS, at its core, was originally designed to simply be the OS of the terminals of the Windows server architecture?

No. Windows started out as a GUI component of DOS - eventually hiding DOS beneath. Anything like a Windows server architecture was only developed way later. In fact, much later than the time Windows became a stand-alone OS (which is, depending on your PoV, OS/2 or Windows NT).

So if this is true how exactly did an OS intended to support a company's server offerings become the de facto standard of home computing?

Because it is not true. Windows became a thing for home users long before any server architecture, mostly through several factors:

  1. It offered a GUI to existing PC users. At that point it was in fact rivalled by GEM, which was for some time ahead of Windows in sales, at least in Europe, and by more integrated solutions like DeskMate or latecomer GEOS.

  2. PC hardware became rather cheap during the late 1980s and early 1990s, eventually undercutting equally powerful high-end home computers like the Atari ST, Amiga or Acorn. Windows was the default counterpart to their GUI offerings.

  3. During the mid-to-late 1990s the PC became a serious game platform, rivalling existing home computer platforms even more. With evolving graphics and sound capabilities, as well as (at the time) almost endless memory, 486 machines drew in game development - not to mention DOS's CD support, which eased the distribution of ever-growing games. This again cemented the PC as base hardware, and while many of these games were DOS-based, using various extenders (DOS/4GW won this race due to being included with Watcom C), Windows was a simple choice for all other tasks - even more so when game companies offered some kind of desktop integration.

  4. The most important single turning point by Microsoft was the introduction of DirectX in 1995/96, with Windows 95 (see the sketch after this list). While it wasn't completely new (there was WinG before), it was the dedication to a unified set of APIs to cover all (well, most) hardware out there - this came at the same time as the 32-bit API stabilized, making DOS extenders obsolete and creating the perfect wave for game developers to switch - which in turn made even hard-core DOS users install Windows - and the remaining home computer users switch.

  5. Last, but for sure not least, the DOS/Windows tax, which forced system manufacturers to either supply some alternate OS (and still pay to some degree) or simply deliver their machines with DOS/Windows. This was heavily enforced by MS during the late 1990s, making next to every brand-name machine ship with Windows installed by default. Users had to actively change this - something most never even thought of.
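
To make point 4 concrete, here is a minimal sketch (my illustration, not part of the original answer; error handling omitted, and hwnd is assumed to be an existing window handle) of the early DirectDraw API. Where a DOS game needed hand-written code for every graphics card, a Windows game reached whatever card was installed through one interface:

    #include <windows.h>
    #include <ddraw.h>

    /* Take over the screen the way a mid-90s game would, but through
       DirectX instead of card-specific DOS code. */
    void go_fullscreen(HWND hwnd)
    {
        LPDIRECTDRAW dd;
        LPDIRECTDRAWSURFACE primary;
        DDSURFACEDESC ddsd;

        DirectDrawCreate(NULL, &dd, NULL);            /* whatever card is installed */
        IDirectDraw_SetCooperativeLevel(dd, hwnd,
            DDSCL_EXCLUSIVE | DDSCL_FULLSCREEN);      /* exclusive use, DOS-game style */
        IDirectDraw_SetDisplayMode(dd, 640, 480, 8);  /* one call, any vendor */

        ZeroMemory(&ddsd, sizeof(ddsd));
        ddsd.dwSize = sizeof(ddsd);
        ddsd.dwFlags = DDSD_CAPS;
        ddsd.ddsCaps.dwCaps = DDSCAPS_PRIMARYSURFACE;
        IDirectDraw_CreateSurface(dd, &ddsd, &primary, NULL); /* surface to draw into */
    }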

So in conclusion: Windows 3.1 became usable as an (application) desktop around 1993, and Windows 95 made 1996 the turning year, when DirectX was able to pull developers over from DOS. It proved to be a cornerstone up till today. All with a more or less gentle nudging by lawyers waving their version of a 2x4.

Raffzahn
  • 222,541
  • 22
  • 631
  • 918
  • "Anything like a windows server architecture was only devloped way later." Windows NT was contemporary with Windows 3.1, and pre-3.1 Windows weren't exactly popular. And even early Windows were almost standalone - while they used DOS for file management, all the other OS stuff was Windows (unsurprisingly, given how thin of an OS DOS was). Calling Windows a "GUI component for DOS" is doing it a disservice - it was a full-blown OS with optional fallback to DOS drivers and using DOS as a pseudo-bootloader. – Luaan May 06 '20 at 11:31
  • 1
    The big milestone for Windows was 2.1 - which allowed multi-tasking with DOS applications running in Windows. In other words, just like all successful MS products, it pushed application compatibility to ridiculous levels. While their competitors (in other ways superior) had only a couple applications, Windows gave you access to pretty much the whole software market on the PC. And when you developed software, you chose the ultra-compatible platform that runs everything. That's the biggie, and that's why there have been very few threats to Windows over the decades (e.g. Java). – Luaan May 06 '20 at 11:34
@Luaan You might want to relate to the OP context for the 'server' part. Also, Windows, all the way to 95, wasn't 'fall back' to, but sat on top of DOS, based on its services. Also, only certain well-behaving DOS applications worked windowed. And last but not least, the multitasking part was only true within the GUI component, as each and every DOS access was serialized, halting concurrency. So no stand-alone OS. Beside that, why so aggravated? Taking a step back and looking at what was really achieved is usually more rewarding, isn't it? – Raffzahn May 06 '20 at 11:50
  • 1
#2 is entirely correct. As an un-enlightening detail for those interested, I'd add that those alternatives were actually much more powerful initially. The problem was that the relatively open platform (and larger installed base) of the PC meant that every hardware firm in the world was free to work on improving it, and had the financial incentive to do so, so those alternatives didn't stay more powerful for very long. (Which was vital to getting to point #3...) – T.E.D. May 06 '20 at 16:22
  • Windows wasn't based on DOS services, with the exception of the file system, and backwards compatibility with DOS device drivers. Everything else - memory protection, multi-tasking, device access etc. went through Windows (including DOS apps since 2.1 on the 386, which was a game changer). And in fact, file system access (which went through DOS) was a typical point where Windows applications yielded execution to other applications. You couldn't have multiple applications accessing the file system simultaneously, but that was rarely a problem. – Luaan May 07 '20 at 08:59
@T.E.D. "More powerful" depends a lot on what aspect you're looking at. The IBM PC didn't have much in terms of dedicated hardware - like music synthesizers, image blitters etc. Most of the hardware was pretty bare-bones. But it had far more raw power - you had much more available memory and CPU even in early IBM PCs. This meant that Amiga and Atari were very powerful in practice, but also a bit less flexible - e.g. they were wonderful for music production, games and such, but got trashed at boring stuff like spreadsheets... and eventually Wolfenstein and its ilk. – Luaan May 07 '20 at 09:02
@Luaan - The first PC had a 4.66 MHz 8088 and could only address 256K of memory. Add a hard drive, and this was still a system IBM was making and selling when the Amiga came out in late 1985. The AT came out in '84 with a 6 MHz 286 and max 16M (but only 1MB addressable at a time in Real Mode). The Amiga (1000) came with 256K (almost every user promptly upgraded to 512) and had a theoretical max of 8.5MB. Its 7.16 MHz M68000 was universally accepted as a superior CPU to the 8088, and wasn't used in the PC design only because it wasn't ready yet when the PC was being designed. – T.E.D. May 07 '20 at 13:15
@Luaan - The reason Wolf 3D wasn't really as doable on the Amiga when it first came out had to do with the entire game being designed around the capabilities of VGA cards, sticking only to what they were good at and carefully avoiding what they were bad at. These cards of course operated completely differently from the Amiga's graphics hardware, so none of those tricks (and few of those limitations) were applicable on that platform. – T.E.D. May 07 '20 at 13:21
  • 1
@T.E.D. Not really true. The first PC had a 4.77 MHz CPU and could address 1 MiB. Mainboard RAM was limited to 64 KiB - 256 KiB in the second, 1982 version. And that wasn't 'theoretical' - in both, RAM was by default expandable to 1 MiB (minus 48 KiB for ROM). IBM offered RAM cards from day 1 for up to 640 KiB. Third-party cards (or some fiddling) could fill the whole address space. Also, the 68k was ready (and used) long before the PC was designed. Last but not least, it wasn't superior in any way; in fact, at the same time the x86 always had an advantage (we already had such a question on RC). – Raffzahn May 07 '20 at 13:28
  • @Raffzahn - Should probably go fix Wikipedia again then: "Eggebrecht wanted to use the Motorola 68000, Gates recalled.[13] Rhines later said that it "was undoubtedly the hands-on winner" among 16-bit CPUs for an IBM microcomputer; big endian like other IBM computers, and more powerful than TMS9900 or Intel 8088. The 68000 was not production ready like the others, however; thus "Motorola, with its superior technology, lost the single most important design contest of the last 50 years", he said." – T.E.D. May 07 '20 at 13:36
@T.E.D. Go ahead and do so. Just, why? Even the Wiki articles describe that there were more, and better-founded, reasons, like quality and cost, than 'not ready', especially as the latter is simply not true. The 68k was introduced in late 1979 and available to customers since early 1980 (Feb?). The IBM PC design was made in June 1980, so availability was not an issue. Beside that, the superiority didn't exist either - it was only an extreme hype among nerds - but that's a different story. – Raffzahn May 07 '20 at 14:08
  • @Raffzahn - From the Bill Gates interview with PC Magazine (March 25, 1997): "And the key engineer on the project, Lou Eggebrecht, was fast-moving. Once we convinced IBM to go 16-bit (and we looked at 68000 which unfortunately wasn't debugged at the time so decided to go 8086), he cranked out that motherboard in about 40 days. ". But again, if you think you can support a different interpretation, feel free to go fix Wikipedia. – T.E.D. May 07 '20 at 14:22
@T.E.D. Let's see ... Wiki entry for 68k: "Formally introduced in September 1979, initial samples were released in February 1980, with production chips available over the counter in November" - so, which one is to be changed? For support, I could dig out an old Exorciser board which has, and I'm pretty sure, an early 1980 date stamp (if not 1979), as I got it in summer 1980. Beside that, you're aware that you argue with anecdotal evidence from a man denying his famous 640k quote? Wikipedia is neither always right, nor (as shown) consistent. So why not focus on more plausible reasons? – Raffzahn May 07 '20 at 14:39
Windows NT 4.0 (1996) supported DirectX 3.0. The racing game Need For Speed II could run on NT 4.0. – rcgldr May 08 '20 at 16:46
  • 1
    @rcgldr Oh yeah, NT 4 could run quite a few games, and Windows 2000 was pretty much on par with 95 (especially with VDMS to give full VESA and sound capabilities in the console). Sadly, many games checked for version instead of DirectX, and rejected all NT systems regardless of actual capability. 2000 came with compatibility mode that let those games think they're running on 98 and everything worked swimmingly. – Luaan May 10 '20 at 12:56
6

The history of Windows goes back a long way. Windows 1.0 was released in 1985 and was simply a graphical interface for MS-DOS. This was neither revolutionary nor uniquely Microsoft, but a trend at that time. For example, GEOS appeared in 1986 and was the same thing for the C64. There was also GEM and a couple others. The development times on these projects make it highly likely that they were parallel developments instead of copies of each other. Instead, they are both answers to Apple's Mac OS from 1984 (and of course all of them are based on the Xerox PARC research).

Windows 1.0 was a dud. Nobody cared. It was technologically inferior to pretty much every competitor, and didn't offer any actual advantages for the user.

Windows 2.0 (1987) became famous mostly because Apple took Microsoft to court for a number of license agreement and copyright violations. In the marketplace it was another failure, its technology started to catch up to competitors but the added value for the user was minimal, about on the level of the "multitasking" of the iPad today. Like 1.0 it was a program for MS-DOS, so you had to start your computer into the MS-DOS command line and then load Windows up separately.

Windows 3.0 (1990) was the first that was successful and, while still technologically behind the competition, had caught up enough to become popular with users, especially in the commercial sector. For the private sector, a few early games tried but largely failed (here are a few games from those days, and most of them are for 3.1). Again, Windows 3.0 was an MS-DOS program, not an operating system.

Windows 3.1 (1992) and 3.11 (1994) were the popular ones that really started Windows as a thing. Wikipedia reminded me that they finally added drag & drop in that version. So I'll refrain from mentioning again that it was still playing catch-up with other GUI systems.

And, guess what, it still ran on top of MS-DOS. Windows 95 was the first Windows that was not an MS-DOS executable, but a bootable operating system.

Sorry for the long introduction, but it is necessary to clearly show that the success of Windows cannot be viewed independently from the success of MS-DOS. With MS-DOS being the dominant operating system at the time (for reasons discussed in various lawsuits and a near unlimited amount of articles and comments) the Windows GUI was riding on the dominance of the MS-DOS OS for a very long time, and only became an actual operating system when it had already achieved dominance of the market. For the vast majority of people, both privately and commercially, the decision at that time was not between Windows 95 and another OS, but whether to move to 95 or stick with 3.11

This history also answers your other question. No, Windows wasn't a server-terminal system. In fact, Windows NT 3.1 - the first server OS from Microsoft - appeared in 1993 and was basically Microsoft's fork of OS/2. It had a so-so reception in the market, unlike its successor NT 4.0 (1996) which as I remember (I started my IT career around that time) was the first server-side Windows OS that professional IT people took seriously.

By that time, Windows was already dominating the client-side market, and I would guess that it was rather the other way around: With NT, Microsoft leveraged its near-monopoly on the desktop market to gain entry to the server market.

tripleee
  • 134
  • 7
Tom
  • 160
  • 5
  • 1
I used W2.11 and we successfully deployed it with applications on 6 sites. This was a work thing. 2.11 was too clunky to use at home. – cup May 06 '20 at 10:57
  • 1
As far as I remember, NT 3.5(1) was the first usable one of the NT streak I met ... while NT 4 basically adopted the Cairo / 95 frontend on the NT core. – eagle275 May 06 '20 at 11:18
  • Yes, 3.5 was a huge step up from 3.1 - I just couldn't list every iteration or it would've all been twice as long. :-) – Tom May 06 '20 at 12:33
  • 1
    You make it look like Microsoft stole NT from IBM - in fact, they were developing OS/2 together. OS/2 3.0 was when the cooperation broke down. But it wasn't Microsoft's first server OS, even ignoring the joint OS/2 development - that would be Xenix, which (originally licensed from Bell/AT&T) was by far the most common UNIX of the 80s. They started from UNIX 7, but quickly expanded it and ported it to new hardware. MS-DOS had a huge advantage in hardware requirements, compared to both UNIXes, OS/2 and (much later) NT. But the OS MS pushed was Xenix, until they switched to NT. – Luaan May 07 '20 at 09:17
  • @Luaan yes, it was initially a joint-venture. I said "fork", not "steal" or "copy", hoping that would be clear enough without going into the fine details. – Tom May 07 '20 at 13:06
  • I wrote a commercial program targeted at Windows 2.1+ that was successfully sold to a certain vertical market. But, yes, it did better when Windows 3.0 was released. – davidbak Nov 18 '22 at 21:43
3

As far as I understand it, the whole personal computing revolution that Microsoft Windows did was not entirely by its own design.

As several earlier posters have noted, it was almost entirely by Microsoft's design. We are talking about the age where MS executives were comfortable talking about "cutting off [their competitors'] air supply". The marketing and competitive warfare was brutal, but largely invisible to the end user population. It was not much different from IBM at the time, so MS-DOS on the IBM PC was probably a match made in heaven (added to the fact that it saved the IBM executives from having to deal with a woman, which they would have, had they gone with CP/M, the dominant DOS at the time).

Is it true that the Windows OS at its core was originally designed to simply be the OS of the terminals of the Windows server architecture?

The first version of Windows I saw was 2.0, running (if you can call it that) on an IBM AT. The experience was horrible - even by the standards of the time. In the mid 80s, I remember working for a company whose corporate policy was a desktop computer was an MS-DOS machine (well, they were making them...). We actually had a Macintosh in the PC room (this was long before the days when every desk had a PC on it), but it had been officially described on the purchase order as a "Graphical Design Aid". I was officially writing code on ISPF on an IBM 370 Mainframe, but it took over a year for me to obtain a 3270 terminal on my desk. A couple of us engineers were playing around with early Amigas, marvelling at the elegance of the core software.

Similar to how Fedora is at its core intended to be the OS of the Red Hat Enterprise Linux servers?

RedHat don't make servers. They distribute Linux-based OSs and other software. Fedora is a community distribution of Linux (GNU/Linux, if you must) by RedHat. RedHat use it, in part, for road-testing their new technology. RHEL is designed as an enterprise operating system, with long-term stability and paid-for support. Whilst most of the tech and architecture in RHEL has passed through Fedora, it's not fair to say "Fedora is the core of RHEL". They are kind of forks of the same codebase, rather, with the Fedora fork always supposed to be a bit more bleeding edge. Incidentally, you can get the full source to both RHEL and Fedora (the CentOS project, among others, is a community rebuild of RHEL).

So if this is true how exactly did an OS intended to support a company's server offerings become the de facto standard of home computing?

Well, your premise is not true, but the "server" core did replace the old desktop core.

I don't remember any Windows Server Architecture with Windows 3.1 or before. The concept of Windows as a server only comes along with NT, IIRC. By then I was managing a set of HP-UX servers. At the time, if you weren't in the IBM camp (mainframe or midrange), you had a choice of DEC VMS or various proprietary Unix implementations. DEC had emasculated their next generation GUI OS (Project Mica) before it saw the light of day, and MS happily cherry picked many of the best (and disgruntled) Mica engineers to work on what became NT.

The Intel-based desktop machines were hamstrung by their 16-bit software model, and the early versions of Windows had to exist within those limitations. When NT came along, MS' pitch was Windows NT Server and Workstation in the enterprise space, and Windows 3.1 at home/small business.

The dichotomy between the two versions became a thorn in MS' side over the course of many years. The low-end version evolved from 3.1 (successful) to Windows 95 (very successful) to Windows 98 (disastrous), but the pressures on the core, like poor 32-bit support, and the lack of any inherent concept of user identity/permissions, meant, especially in the age of the ubiquitous Internet, massive problems. The ultimate fix was to rehome the graphical shell on top of the NT core, creating a more or less single Windows architecture, in the several variants which we see today.

No-one foresaw the elimination of the original desktop Windows codebase - it just happened because it was the most pragmatic solution to the engineering morass MS were falling into at the time (it had happened to IBM, too, in the 1960s: in that case it led to the 360 mainframe system).

I remember a grey-beard, after one IT person moaned at the auto-update feature of Windows 10, just mentioning in passing that this quirk for the home user is probably a design feature that has something to do with servers, as a terminal of a server would probably never want to skip an update.

Actually, it's got more to do with life on an ever-hostile Internet. Before Windows Update came along, just keeping Windows current with security patches was a massive exercise, and a lot (maybe even the vast majority) of owners did not keep their machines patched. The result was massive problems, both in the enterprise and in the home. And when the easiest target of malware is also the predominant platform, you have the makings of a perfect storm. The way Windows is built means a distribution mechanism for updates like you get with RedHat or Ubuntu is a lot harder to implement (the package management systems in most Linux distros provide pretty good decoupling and dependency management), which is why I think it took MS so long to get it to work properly. Even then, it was originally something you had to take responsibility for by running it yourself. It took quite a while for the realization to dawn that the only proper solution was to make the process automatic, at least for the critical stuff.

So this greybeard (who worked servers, networks and security in the nineties and the noughties) has no compunction in telling your aforementioned greybeard to pull his head in - the alternative is so much worse.

mikb
  • 31
  • 1
I may have thought MS servers a bit more ubiquitous than they really are/were – Neil Meyer May 06 '20 at 13:12
Haha, I think the younger bearded fellow was getting a bit of stick from management for the increased bandwidth usage the update system was making a reality. – Neil Meyer May 06 '20 at 13:19
  • @NeilMeyer back then, there were more commercial server options that could also serve Windows clients (eg Novell NetWare). – rackandboneman Jan 13 '22 at 14:45
2

When IBM went into the personal computer business, their design became a standard for the industry, and IBM-compatible computers quickly became the majority of personal computers produced. MS-DOS from Microsoft was the operating system that IBM picked, and the other manufacturers used it as well. When Microsoft came out with Windows, they already had a big advantage. As programmers wrote software, they concentrated on the operating system with the largest share of the market, which was Windows. When people were trained to use computers, the same thing happened: many were trained on what was most common. So Windows' market share advantage became self-reinforcing.

exploregis
  • 121
  • 1
  • Yes this is an important point. Windows became prominent because MS-DOS was already widespread. Microsoft already had their foot in the door. – Kingsley May 06 '20 at 22:19
2

I feel like all the answers miss the key point that catapulted Windows above the others:

Microsoft made it very easy for developers to use.

This has been a Microsoft strength for a long time. The success of any platform is going to hinge largely on developers coming in and writing programs that use it. Unix suffered (and its descendants continue to suffer, but it was a huge problem in the 80s and 90s) by being fractured — a program for one flavor of Unix or Linux does not necessarily work on another flavor and must be compiled separately. Apple suffered by being too much of a closed system. Meanwhile, Microsoft began developing concepts like DirectX, IDEs like Visual Studio, and powerful debugging tools like WinDbg. These really cemented Windows as the developer's platform of choice.
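
As a hedged sketch of what that developer experience looked like (my illustration, not from the original answer): every Windows program was built around the same documented skeleton, a message loop feeding a window procedure, so skills and books like Petzold's Programming Windows transferred directly between projects:

    #include <windows.h>

    /* The canonical Windows skeleton: a window procedure switching on
       message types, driven by one message loop. */
    LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
    {
        switch (msg) {
        case WM_PAINT: {
            PAINTSTRUCT ps;
            HDC hdc = BeginPaint(hwnd, &ps);
            TextOut(hdc, 10, 10, "Hello, Windows", 14); /* draw on request */
            EndPaint(hwnd, &ps);
            return 0;
        }
        case WM_DESTROY:
            PostQuitMessage(0);                         /* end the message loop */
            return 0;
        }
        return DefWindowProc(hwnd, msg, wParam, lParam); /* default handling */
    }

    /* In WinMain, after registering a window class and creating the
       window, this loop dispatches every event to WndProc: */
    void run_message_loop(void)
    {
        MSG msg;
        while (GetMessage(&msg, NULL, 0, 0)) {
            TranslateMessage(&msg);
            DispatchMessage(&msg);
        }
    }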

Maybe someone else could elaborate about the earlier days (8088–80386 era in particular) but I believe it all holds true: Microsoft (and Intel) were more interested in creating an open platform, available to developers, while Apple wanted to keep a firm proprietary grip and Unix simply could not settle on a standard, leaving Intel/DOS/Windows architecture as a clear choice for any developer who wanted to have a broad audience with a single executable.

("Single executable" was also more important in the 80s and 90s when you couldn't simply expect people to go download the one they need! Or worse, do their own compiles.)

Stephen Kitt
  • 121,835
  • 17
  • 505
  • 462
JamieB
  • 121
  • 2
  • 1
This is a good part of it that is underrated, thanks! Especially consider that for Windows, the Windows group prevailed upon the Languages group and got the entire toolchain (compilers + SDKs) released to anyone for free! Free as in no charge at all! Free as in $0 compared to >$250 for MS compilers up to then! Compare to Apple and the Mac and their pricey (but deluxe! hardcover manuals!) SDK! (I don't remember what the Apple SDK subscription cost, but I had a friend who was an independent Mac developer and I remember he thought they were expensive. And he was used to HP's prices!) – davidbak Nov 18 '22 at 17:17
  • @davidbak when was the Windows SDK free? I remember it costing significantly less than, say, the OS/2 SDKs, but it still cost money (whether acquired separately, or bundled with Microsoft or Borland’s compilers, or later through MSDN). – Stephen Kitt Nov 18 '22 at 17:29
You know, I think you might be right, @StephenKitt - but the cost must have been quite low, on the order of $100, not $300-500. MSDN eventually cost real money - but that gave you access to this unending supply of CDs with tools and operating systems and stuff. I have to think about this ... – davidbak Nov 18 '22 at 18:00
  • 1
    @JamieB on top of the decent tools (well, MS C was a pain to use, but Visual C++ was quite nice even in 1993), there was also Charles Petzold’s great Programming Windows books which helped lots of developers get started. – Stephen Kitt Nov 18 '22 at 18:27
  • @davidbak Programmer’s Shop catalogs indicate $500 retail, around $350 in practice. (BC++ was $495 retail.) The OS/2 SDK cost $3,000… – Stephen Kitt Nov 18 '22 at 18:28
  • @StephenKitt - huh. how did I afford it then? wonder what I told my wife ... – davidbak Nov 18 '22 at 21:32
Sadly, Programming Windows also started the tradition of miserable-looking, hard-to-maintain Windows code: because he was writing code samples for a book, which everyone else copied as the right way to do things, we had over a decade of programs where all message handling was in a single long switch statement for each window ... sigh. Wasn't really his fault I guess ... – davidbak Nov 18 '22 at 21:34
  • @davidbak I’m just sharing what I found, not saying that that’s the whole story. – Stephen Kitt Nov 18 '22 at 21:39
1

By the nature of network effects, there was always going to be a single dominant OS. Nobody at that time was able to pull off 100% compatibility with a complex GUI and its ABI. (DOS was still simple enough that there were at least three.)

It wasn’t going to run on proprietary hardware based on the Motorola 68K (like the Mac or Amiga), because that was more expensive and incompatible with existing software and third-party hardware. That was even more true of UNIX workstations. It wasn’t DESQview because that had no GUI and the company was too small to make a big splash.

Many people at the time expected it to be IBM’s OS/2, but IBM turned out to be so bad at selling it that its own personal computers shipped with Windows. It also did a poor job of prioritizing the features customers actually wanted. Most egregiously, early versions could not print anything and had no GUI. By the time IBM got OS/2 into a usable state, they had to bundle a copy of Windows with it to make it compatible. By then, Microsoft had won.

Davislor
  • 8,686
  • 1
  • 28
  • 34