31

Windows NT implemented POSIX compatibility because some US government contracts required it. It is said that the POSIX implementation was only pro forma, not intended or suitable for real use (i.e. Microsoft hoped the customer would accept the operating system and then run NT-specific software, thereby giving them a moat against competing platforms).

In what specific ways was the POSIX implementation unsuited to real use?

rwallace
  • 60,953
  • 17
  • 229
  • 552
  • 6
    Would you mind citing sources for any (or ideally all) of these assumptions? (Being just pro forma (and what that is supposed to mean), being not intended for use, and being not suited - bonus for how such contradictory assumptions can all be true at the same time) – Raffzahn Jul 13 '23 at 09:33
  • @Raffzahn 'Just pro forma' refers to something that is formally provided, but not meant to be actually used, so it's actually just one assumption – or, not assumption, but something gathered from what I've read on the topic. It's difficult to provide a formal reference, because it's not the kind of thing discussed in formal sources, but there is some less formal discussion in some of the replies to https://www.quora.com/Is-Windows-POSIX-compliant – rwallace Jul 13 '23 at 09:47
  • @Raffzahn e.g. "However, MS has undertaken some half-hearted attempts at POSIX compliance, including the Microsoft POSIX subsystem - Wikipedia - giving them just enough wiggle-room that their offerings can be sold to the U.S. Federal Government departments which require POSIX compliance." – rwallace Jul 13 '23 at 09:47
  • 3
    Well, I know what pro forma means, but you may want to explain what it is supposed to mean in the context of an API. Also, I still see three independent assumptions made here: 1. being 'pro forma', 2. not suited for use, and 3. not intended for use. "Because it's not the kind of thing discussed in formal sources" - isn't that a very strong hint that it's unfounded opinion? But even so, it would help the question to cite at least that babbling (BTW, there is already an in-depth answer on that site debunking that question). – Raffzahn Jul 13 '23 at 10:11
  • 1
    @Raffzahn Very well, consider it explained! As for 'unfounded opinion', I think there is an annoying tendency to give up far too easily on being able to piece together a reasonable picture of the truth about historical questions; I think this one should be answerable. https://retrocomputing.stackexchange.com/questions/1083/how-posix-compliant-is-xenix is along similar lines... – rwallace Jul 13 '23 at 10:48
  • @Raffzahn Do you mean Clem Cole's answer? That is interesting indeed, and does make a case for the converse position. – rwallace Jul 13 '23 at 10:49
  • Not really. a) The cited question is not only about an OS from before POSIX, but also asks specifically about the compatibility of XENIX with POSIX (though without naming what exact part or version is meant). Windows is certified POSIX compatible, so there is no open question. b) Making a claim does not per se create credibility. There are people claiming the earth is flat. Any claim needs support - something not given here in any way. – Raffzahn Jul 13 '23 at 11:05
  • 2
    In my experience it was just a mass of bugs. We failed to get anything successfully ported to it. – Chenmunka Jul 13 '23 at 13:44
  • 4
    If my vague memory serves me right, the NT Posix subsystem wasn't really that bad. But the implementation in MS's development tools (which literally is not required to make an OS Posix-compliant) was less than half-baked. It was really clear they wanted to make it as difficult as possible to implement for the Posix SS. – tofro Jul 13 '23 at 14:26
  • 2
    Note that even though the POSIX subsystem was considered minimal/difficult to use/whatever as discussed in the answers it nevertheless required serious work with implications elsewhere. For example: Unlike previous Microsoft file systems NTFS supported case-sensitive filenames in addition to case-insensitive - and this had implications to the Windows subsystem too. Still does. – davidbak Jul 13 '23 at 14:43
  • 2
    The native 'create process' call (probably NtCreateProcess, but it's been too many years since I cracked open my copy of the NT native API) has provision for passing in an existing address space reference. This presumably was to allow 'fork' to be implemented; there's no use for that capability in the Win32 API. – dave Jul 13 '23 at 23:21
  • Maybe it was not intended for real use, but only to satisfy government red tape. – Neil Meyer Jul 15 '23 at 14:06

5 Answers

41

In what specific ways was the POSIX implementation unsuited to real use?

It wasn’t unsuited, but it’s important to understand what was certified. In order to meet requirements for certain US government purchases, laid out in FIPS 151-2, Windows NT’s POSIX subsystem implemented POSIX.1, i.e. the C-level API. It was usable as-is, but actual usability was limited to what POSIX.1 provided; in particular, no end-user tools were included apart from pax, the POSIX archiver. The POSIX subsystem allowed POSIX programs to be compiled and run on NT, and that’s all that was needed.

Nowadays we tend to think of POSIX compliance as involving both APIs and user-level utilities (POSIX.2); to get that on NT, users had to pull in a third-party tool at first — OpenNT (later Interix). In 1999, Microsoft acquired Softway Systems, the developers of Interix, and made Interix available as an optional, free component for NT, Windows Services for UNIX (version 3.0 and later). This was ultimately discontinued.

It’s also worth noting that subsystems on Windows are silos, so the POSIX subsystem doesn’t provide any features accessible to “Windows” programs (using the Win32 subsystem). The POSIX APIs on Windows NT were only available to programs running on the POSIX subsystem, and programs running on the POSIX subsystem couldn’t access Win32-specific features (e.g. Win32 ACLs).

Stephen Kitt
  • 121,835
  • 17
  • 505
  • 462
  • 1
    Citing facts is helpful, but it might also help to add how this relates to the claims made, which is the core of the question, isn't it? – Raffzahn Jul 13 '23 at 11:07
  • 8
    Perhaps a line or two explaining the consequences of being not just 'an API' but 'a subsystem' in the NT model. A (non-native-NT) program runs under one subsystem, it can't pick and choose. Apart from that, this answers the question just fine: the programming environment that was implemented was not particularly rich. – dave Jul 13 '23 at 11:28
  • 2
    Windows Services for Unix was actually powerful enough for me to port the (at the time) cygwin-based Firefox build system to it. I did have to pretend I was cross compiling from Interix to Win32 though. – Neil Jul 13 '23 at 20:56
  • 3
    Where can I read more about Windows subsystems? Trying to google it just gives a zillion results for Windows Subsystem for Linux and a few hits for Windows Subsystem for Android. – user2357112 Jul 13 '23 at 21:22
  • 2
    The book Windows Internals is a good source for information about the subsystems (there was also an OS/2 subsystem that was removed in, I believe, 4.0). Wikipedia has a half-decent writeup of the different subsystems that shipped, stuck in the middle of its article on NT architecture: https://en.wikipedia.org/wiki/Architecture_of_Windows_NT#User_mode – Chris Charabaruk Jul 13 '23 at 22:17
  • 2
    You can read about Windows subsystem concepts in the design documents of the late lamented Mica OS for DEC's PRISM hardware, principal designer D. N. Cutler. :-) – dave Jul 13 '23 at 22:46
  • 1
  • After the demise of the OS/2 subsystem, and with the need to accelerate the graphical environment by reducing the overhead of the architectural structure, the subsystem mechanism as originally built sort of ... went away. The structure wasn't maintained. The subsequent WSL isn't based on it at all - a completely different technological base. – davidbak Jul 13 '23 at 23:24
  • 1
    The only OS I've used that did the 'multiple personality' thing successfully was RSTS/E :-) – dave Jul 13 '23 at 23:28
  • 1
    Stephen makes the point about subsystems (or APIs) being siloed, but an interesting detail is that in (I think) the NT 3 era Win32 gained symlinks for files - but not directories. I've got a very vague recollection that INND (v1) was somehow involved there. – Mark Morgan Lloyd Jul 14 '23 at 20:25
  • 1
    @MarkMorganLloyd perhaps there was another Win32 API before that, but the current symbolic links Win32 API was added in Windows Vista. NTFS supports symbolic links since version 3.1 (Windows XP). symlink is only part of POSIX since Issue 4, Version 2, published in 1994; as far as I’m aware NT only implemented the 1990 version of the standard. – Stephen Kitt Jul 15 '23 at 09:04
  • 1
    @StephenKitt I've been trying to think back. I can't remember whether there was an INN port in the Reskit of the era, but it was of interest to me since I was using INN for archiving messages of various types in I think much the same way that Larry Wall describes using bnews (?). This might have been at a trade presentation at which MS also announced providing the NT source to selected academic groups... I can't put a date on that but I think it was either the late v3 or early v4 era. – Mark Morgan Lloyd Jul 15 '23 at 09:36
  • 2
    For information about the requirements for Windows NT and the use of the subsystem architecture to meet them, you should look for a copy of Inside Windows NT by Helen Custer (1993), with a foreword by Dave Cutler. (It's totally different from the later Solomon/Russinovich Inside Windows 2000 and their later books, which are more technical.) There was a need to run applications from 16-bit Windows, MS-DOS, POSIX and OS/2 environments as well as providing the new Win32. The approach taken was similar to Mach (which supported incompatible Unix implementations). Anyway, it's a good read! – AndyK Jul 15 '23 at 10:23
19

You would be interested in this video, in which the vlogger installs the POSIX subsystem for Windows NT 4.0 SP6 (after previously having tested the version for NT 3.1). It’s a summary of a livestream of installing POSIX and attempting to run it. He concludes that it did the bare minimum to check off the boxes for the NIST standards required at the time for federal government contracts. (This is debatable; see below.) He does not install the full Services for Unix (SFU), or the Interix product that Microsoft later bought and made available for download.

The only POSIX utility supplied with Windows was pax, the POSIX archiver, able to unpack .tar and .cpio archives. Several other utilities were supplied with the Windows NT Resource Kit (which was sold as a CD-ROM with a set of reference books, and was not widely publicized as being necessary to use POSIX). To get the subsystem running, the vlogger was able to find old documentation about disabling incompatible NT features, configuring environment variables, and creating a termcap file. At that point, he needed to create aliases like cat for CAT.EXE, which were not provided out of the box, and could then run sh and vi. (One flaw in his critique is that he complains about behavior that is, in fact, correct, such as POSIX filenames being case-sensitive.)

To do anything else with it, you would need the Windows NT System Development Kit, as well as Visual C++. Several development utilities are shipped only as source, such as less and make. These did not compile as-is, but needed some minor fixes. With a bit of work, it is possible to build a working cc.exe, which wrapped the Visual C compiler in a UNIX-like command-line program that produces POSIX binaries.

At that point, the vlogger tests various UNIX software, and concludes that the support is too rudimentary to be useful. In fact, he says he doubts anyone actually managed to do anything with it in the ’90s, or that it was even intended to be used. The shell could not run configure scripts. All POSIX-subsystem programs are sandboxed and have no access to the Win32 API, or to graphics, memory-mapped files or networking. Only 110 API functions are supported, enough to compile and run vi, but not anything much more complicated, such as rogue or even adventure. The vlogger was able to compile trek from the BSD games collection, for both 386 and MIPS, but that was it.

Arguably, he was being too harsh, perhaps due to unfamiliarity with how UNIX worked at the time. For example, he was testing the POSIX subsystem with source code that wasn’t POSIX compliant. POSIX is specified to behave like UNIX System V, not BSD UNIX. Back in the day, I wouldn’t have expected BSD source to compile on AIX with the default settings. Even today, on Linux, those programs might not compile without -D_BSD_SOURCE. I recall needing to make minor tweaks to a large number of BSD programs, such as changing <strings.h> to <string.h> or adding #define strcasecmp(s1, s2) strcmpi((s1), (s2)). The compile errors with at least adventure seem to fall squarely in that category. Some of the other error messages that frustrated him seemed clear to me (such as the ones telling him to set an environment variable). It was common for OSes like IRIX and Solaris to add some compatibility for BSD, but what he was trying to do would not have worked on many versions of UNIX on the market back then.

Davislor
  • 8,686
  • 1
  • 28
  • 34
  • 9
    In theory it's useful to have this kind of report from someone who actually used the POSIX subsystem, but the author of this video is openly anti-Microsoft and seems to have limited knowledge of Windows. E.g. he describes it as "obscene" that he has to install Visual C++, the Windows SDK, and the Resource Kit to compile software for the POSIX subsystem, even though those were normal things for Windows developers to install. I don't know how many of his problems were due to unfamiliarity with the platform plus a lack of motivation to solve them since ultimately he wants to paint MS as bad. – benrg Jul 13 '23 at 20:39
  • 4
    @benrg As I mentioned, he also got frustrated about things that were correct for POSIX, such as case-sensitive filenames. The result was perhaps unduly negative. Some of those error messages also look like things that someone more familiar with UNIX could have fixed. The MAKEPATH error message, for example, seems to be clearly saying to set an environment variable, although I’m not sure if he tried it. – Davislor Jul 13 '23 at 21:02
  • 6
    As someone who has to work with AIX on a semi-regular basis, I still don't expect anything to work on it. – A. R. Jul 14 '23 at 21:06
  • @benrg: Actually I consider those cc shenanigans obscene. There should have been a real cc, as, ar, and ld in the environment, running independently of any Win32 service. On the other hand, having cc pull files from the SDK wouldn't have been a major issue. – Joshua Dec 04 '23 at 22:37
7

Having tried to use it, I found it had a major fault that rendered it unusable.

Deleting a file while it was in use did not work, and quite a few applications from the era expected it to. With those applications broken, multi-user operation was reduced to single-user - if the applications worked at all.

As far as I can tell, the network was not accessible either.

As an aside, the MS implementation made it impossible to implement POSIX.2 (shell scripting) because /usr/bin wasn't a valid path, so #!/bin/sh didn't work and neither did #!/usr/bin/env sh. People would have been hopping mad if they had to type #!/usr/bin/env sh, but at least it would have been portable to any POSIX.2 system (a few systems tried to relocate the shell long ago; mostly it didn't work well). The path to getopt didn't work for the same reason.

Joshua
  • 1,829
  • 14
  • 22
  • 1
    If an underlying file system doesn't support such an ability, I don't think a POSIX layer can add one. What would even "real" Unix do if a program attempted to use such functionality on a network drive, on a server which couldn't delete a file that was open for access by anyone? – supercat Jul 13 '23 at 19:40
  • 4
    @supercat: That's the point. It's effectively mandatory functionality and it wasn't provided on any filesystem. The NTFS filesystem is technically capable of it; the NT driver isn't. But in fact your question has an answer: Sun's NFS didn't implement it on the server side but rather in the kernel on the client side. Other workstations saw ghost .nfsXXXXXX files, which caused issues. (Trying to implement it in user mode does not work.) – Joshua Jul 13 '23 at 20:06
  • 3
    The whole mess around 'file deletion' on NTFS seemed to be a hack for FAT compatibility, where the directory entry is the thing that allows you to get to the file, so FAT never had the capability to remove the directory entry and keep the file around until closed. I really wish they hadn't bothered to be 'compatible'. – dave Jul 14 '23 at 01:34
  • 3
    "quite a few applications from the era expected this" - sure, but what did POSIX say? That's the thing with standards. Either they say something has to behave in a standard way, or the implementation has the freedom to choose. – MSalters Jul 14 '23 at 06:57
  • 5
    @MSalters: The answer to that question would only tell us whether they "cheated" in getting POSIX certification, or whether many Unix programs depended on functionality that POSIX failed to actually specify. Either way, it was part of the functionality that normal programs expected. – Peter Cordes Jul 14 '23 at 16:14
  • @MSalters: Another possibility is that standards may say that implementations "should" do something, with a strong implication that implementations which fail to do so should generally be viewed as being of poor quality, but recognizing that in some cases having a poor quality implementation which is usable in e.g. a resource-limited context may be more useful than having no implementation at all that is usable in that context. – supercat Jul 14 '23 at 17:57
  • 1
    The POSIX standard did not require network access to be provided, and it also did not define that an open file can be deleted, etc. It was not just NT; VMS also had POSIX support, but nearly all source code that claimed to work on POSIX assumed it was Unix under the POSIX layer. – Ian Ringrose Jul 15 '23 at 23:04
  • @IanRingrose: "nearly all source code that claimed to work on POSIX assumed it was Unix under the POSIX layer." That's the point of the POSIX specification; the standardized API surface for Unix systems. – Joshua Jul 15 '23 at 23:11
  • 1
    @Joshua the POSIX standard claimed it was independent of Unix and was sold to the US government as a standard all operating systems could implement. – Ian Ringrose Jul 15 '23 at 23:14
3

TL;DR: Those claims do not add up.

The whole issue is, to use nice words, an urban myth. It ticks all the boxes of an 'MS is an evil conspiracy' narrative but delivers no validation - neither references nor concrete claims - leaving it open to speculation by interpretation.


An API can not be implemented and not implemented at the same time.

Being an API is the key part here. POSIX stands for Portable Operating System Interface and is an API. POSIX isn't some OS or a certain OS or kernel design and especially it's not UNIX. While it's based on Unix and tries to unify the user side API of various Unices, it can be implemented with any OS (*1).

A POSIX implementation is also not a name game; it is certified against a fixed catalogue. Either it exists or it doesn't. For that it doesn't matter whether the POSIX API is the only one available, or even the one used most - or how it is implemented: as basic system calls, a system library, or a user-side library. All that counts is that programs using POSIX functions in the way POSIX defines them can be compiled and executed in that environment.

Not defining the way of implementation is exactly what makes POSIX a portable interface definition.

Windows offers the POSIX API out of the box. The fact that other APIs also exist is meaningless - likewise which API an application uses. Windows is, if anything, among the OSes least divergent from Unix, considering that true classic mainframe systems with zero commonality with Unix, like OS/390, z/OS or BS2000, have been fully POSIX certified since the 1990s.


Windows NT implemented POSIX compatibility because some US government contracts required such.

MS adding it just for "some US government contracts" sounds far-fetched - more like trash talk than a serious angle. POSIX was a major buzzword in professional computing in the 1990s, a must for anyone who wanted to stay in business.

It is said that the POSIX implementation was only pro forma,

Not sure what 'pro forma' could mean in the context of an API. Either it is implemented according to the POSIX specification or it is not. There is no in-between (*2).

not intended

Intention is a very debatable point, rather opinion-based, without any valid statement from MS themselves or reliable proof thereof.

or suitable for real use

Which likewise needs at least some reasoning as to why a provided, certified API should be unsuitable for 'real' use.

(i.e. Microsoft hoped the customer would accept the operating system, and then run NT-specific software, thereby giving them a moat against competing platforms).

What hodgepodge is that supposed to be? Requirements are made and met, otherwise there is no procurement. So the question is not what MS hoped, but what the requirements were. Was there a requirement that some application provided by MS use that API, or only that the API be present? In the latter case, it doesn't matter which API the application software uses. Hard to see anything relevant here.

In what specific ways was the POSIX implementation unsuited to real use?

That's an extreme assumption, without any source or specification.


*1 - Provided that OS does offer the basic capabilities the API requires.

*2 - No, implementing only parts (like .1 or .1a, etc) is still implementing and valid for certification.

Raffzahn
  • 222,541
  • 22
  • 631
  • 918
  • 5
    You seem to be overlooking that it was not just 'an API', it was 'an NT subsystem'. If you're running in the Posix subsystem, you don't have access to Win32 APIs as well. – dave Jul 13 '23 at 11:30
  • 2
    @another-dave Sure, but what's your point? POSIX is an API. Creating an enclosed environment is a way to implement it. POSIX does not specify any of that. Same way as it does not specify how the OS is to be managed and so on. Unless the requirement is that the application must use both at the same time - and if, then POSIX can quite well be handled by user side libraries (Cygwin, etc.) – Raffzahn Jul 13 '23 at 11:52
  • 4
    I can think of a couple of ways to completely implement something like the POSIX API while discouraging any real-world use, making it a pro forma implementation: you could, for instance, make it slow, limit its access to system resources like memory or disk space, or make it hard to configure or annoying to use. – Michael Graf Jul 13 '23 at 12:03
  • 1
    The point of being POSIX compliant was that one could take a random C program from somewhere that adhered to the POSIX standard and it would compile and run on the target OS. The POSIX standard does not require that programs also be able to take advantage of other OS support, including the Windows API. So, on NT one could (and did) port POSIX-compliant C programs with minimal to no fuss. – Jon Custer Jul 13 '23 at 12:36
  • 5
    I think being qualified to receive government orders is quite a bit more than a knick-knack. – tofro Jul 13 '23 at 14:55
  • @MichaelGraf I have a hard time seeing any business reason why a company would waste money to implement an interface they don't want and then invest additional money to make it perform badly. Besides, quality (performance) is not really related to success - the success of Windows being the best example. Heck, an argument like this sounds like a real low-quality conspiracy theory ... MS wants to destroy all that is good, or something like that. – Raffzahn Jul 13 '23 at 16:40
  • 5
    To be clear, I think it unlikely that Microsoft invested additional money to make NT POSIX work badly, but I have no difficulty believing that they didn't bother investing additional money to make it work well. We all know that once you have something that kinda sorta works, it takes additional effort to make it work well, and there are always other things that could be done with those resources. – rwallace Jul 13 '23 at 19:44
  • 3
    @Raffzahn — What rwallace said: when you need POSIX compliance only to tick off some box for a major customer, you can go for the most naive, minimum-effort implementation that passes the necessary test. No need to optimize anything, no need to make it work /well/, or be user-friendly. Besides, we are talking about a company that, at around the same time, made Windows 3.1 throw fake error messages when running on top of DR DOS. – Michael Graf Jul 13 '23 at 21:03
  • @rwallace But that is the basis of the assumptions made: that it was made bad on purpose. Underlined by the claim that all that investment was for a single customer, when in fact at that time nearly every OS manufacturer (from Apple to IBM) added some level of POSIX compatibility to their OS. All for that single project? Come on. The whole thing is a classic compilation of 'facts' ticking the boxes of conspiracy theorists, knowing well that MS is concentrated evil. – Raffzahn Jul 13 '23 at 21:48
  • 1
    @raffzahn - this is a question about how NT's actual POSIX implementation was unsuited to real use, not about how someone could have done it differently. My opinion is that it's due to two factors that limited it: POSIX.1 only, and implemented as a subsystem. – dave Jul 13 '23 at 22:39
  • 2
    A related POV would be the POV of the customer of these NT systems with POSIX API compliance: Did they want POSIX compliance to be able to run POSIX-compliant programs? Or did they want POSIX compliance to be able to buy Windows NT systems with funding from government contracts? Nearly always the latter, I think. So it's not just that Microsoft didn't care to provide a good implementation. Their customers didn't care either. (The ones that did care bought Unix.) – davidbak Jul 13 '23 at 23:28
  • 1
  • @davidbak Quite a valid line of thought - I'd even go a step further and eliminate the government part here. POSIX was a buzzword for professional/enterprise buyers during the '90s. Manufacturers used it to make their systems look competitive (especially all the Unix-alike ones trying to reduce cannibalizing each other while losing ground against Windows), while management on the customer side used it as a keyword for buying decisions - as so often, without any idea what it's good for. – Raffzahn Jul 13 '23 at 23:49
  • 2
    I think "most programs targeting POSIX cannot run on it because they followed common conventions that strictly speaking aren't required by POSIX" is exactly what would classify it as "certified but not useful for practical use". Sure, an implementation might pass POSIX certification, and that might allow it to be used for certain contracts, but for practical use what actually matters is whether or not you can run the programs you want on it, not whether it passes some theoretical specification. – Lie Ryan Jul 14 '23 at 05:53
  • It just seems like a minor feature that worked OK and was considered important at the time but then company resources were used elsewhere. – Neil Meyer Jul 15 '23 at 14:09
3

I was working on porting Unix software to Windows NT at the time.

Strict POSIX was never a useful standard, as it did not include network access or graphical output. Unix systems that supported POSIX did not prevent the use of functionality that was not defined by POSIX.

Both Windows NT and VMS made it practically impossible to use functionality that was not in the POSIX standard from a program running in the POSIX subsystem. But there was no point in running software on Windows NT if it could not display graphics or use the network.

Ian Ringrose
  • 425
  • 3
  • 6