28

Suppose it's late 1980 to early 1981. You've got some software you want to write for the IBM PC, which will be released later in 1981; you want to get started ASAP, and you believe C is the right language for the job. However, a C compiler on the PC does not become available until 1982.

What's the cheapest way to get your hands on some kind of usable C programming environment immediately?

wizzwizz4
rwallace
  • 13
    Note that the IBM PC wasn't announced in the time frame you're talking about. You'd be targeting a computer you'd only know by rumours, without any detailed knowledge of its hardware or software interfaces. –  Jun 27 '17 at 16:44
  • 14
    @RossRidge has a fair point. The best way to get started on writing a PC application in 1980 or 1981 is to start writing a CP/M application and port it when the PC is released. – Jules Jun 27 '17 at 20:05
  • 14
    C in the '80s was largely assumed to be "an exotic thing only used by funny Unix guys" and had nowhere near the popularity that you seem to assume. High-level-language programs were mainly written using BASIC and Pascal compilers (even BCPL was more popular than C), or, even more commonly, not in a high-level language at all, but in assembly. – tofro Jun 28 '17 at 06:21
  • 3
    I don't know about "cheap". FWIW, the first version of AmigaOS was developed (in C, with some assembly) in 1982-1984, roughly the same timeframe. Amiga hardware simply didn't exist at that time. The developers at Amiga Inc. used cross-compilers, emulating the hardware on a SAGE IV. I would assume that was a rather common way to develop for up-and-coming platforms: As the platform doesn't exist yet, you cannot be self-hosting, thus you rely on cross-compiling. (Actually AmigaOS did not become fully self-hosted for many, many years.) – DevSolar Jun 29 '17 at 14:28

10 Answers

43

So you want to write a C program for the IBM PC before the first C compiler for the PC is released. How do you go about it?

There are three options I can see:

  1. Write your own C compiler
  2. Use a cross compiler for 8086 on some other platform
  3. Wait for a C compiler to become available
  4. Don't use C

No, four options. Amongst our many options are fear, surprise and a fanatical devotion to the Pope.

I'd be pretty confident that all of the software houses that were in that position would have gone with option 4. The reason for this is that the premise of your question is faulty: professional software developers don't have dogmatic views about which language to use, they look at the target platform and then pick (what they think is) the best language that is available. In my 30 years as a professional software developer, I have never had the luxury of a free choice of what language to use on any project. On a professional level, my favourite language is whichever one gets the job done on the target platform. For my personal projects, it's Swift all the way.

Another point to remember is that, in 1981, C did not have the ubiquity in the world of personal computers that it later gained. Everybody wrote in BASIC or assembler. The deficiencies of BASIC were recognised, but the answer was not obviously "replace it with C". Several languages were mooted as a replacement, including (IIRC) Pascal, COMAL and even Fortran, but the first time I even heard of C was when I went to university in 1984 and was exposed to Unix for the first time. There was a significant group of people who thought "why would you use anything but assembler?"

The software house developing an application for the IBM PC in 1981 would have had no hesitation in breaking out the 8086 assembler and just getting on with it.

JeremyP
  • 26
    Also, in 1981, Pascal was far more common than it is today - it's surprising how many executables from that era, when hexdumped, show signs of having been written in one or another version of Turbo Pascal. – Jeff Zeitlin Jun 27 '17 at 11:24
  • 5
    @AnoE Thank you for your comment, but I did answer the question as written. I gave three alternatives to not using C. Two of them did not really need expanding and the other (cross compiling) has been dealt with by other answers. I think this web site benefits by also discussing questions from a slightly different point of view occasionally. If my answer was the only answer, you might have a valid point, but it is not. – JeremyP Jun 27 '17 at 12:57
  • 2
    @JeremyP, no problem, I'm happy to agree to disagree. – AnoE Jun 27 '17 at 13:49
  • 3
    To the person who suggested the edit to the "Amongst... " line, sorry I rejected most of it, but the line is based on a quotation and I had it correct, except for the punctuation. – JeremyP Jun 27 '17 at 15:58
  • 1
    "based on a quotation" - some of us read that line with Micheal Palin's voice in our heads. I'm kinda shocked that somebody didn't recognize it. – Martin Bonner supports Monica Jun 28 '17 at 11:34
  • 3
    @MartinBonner When I wrote the first draft of the answer, I had three bullet points only. After proof reading and just before posting, I thought of "wait for a C compiler to become available" and added it as option 3. (That, btw is a perfectly reasonable choice, especially if you know one is about to be released). After posting the answer I noticed I hadn't changed the line above to say "four" and Michael Palin, Terry Gilliam and Terry Jones immediately and surprisingly popped into my head. I must say I wasn't expecting the Spanish Inquisition. – JeremyP Jun 28 '17 at 13:46
  • 1
    @JeffZeitlin Though note that Turbo Pascal itself wasn't released until 1983. – David Richerby Jun 28 '17 at 14:52
  • @DavidRicherby - I'll have to check that; I thought that TP for CP/M, at least, predated that. Not impossible that I'm remembering wrong, though. – Jeff Zeitlin Jun 28 '17 at 15:12
  • @JeffZeitlin Case in point about Pascal's availability: when Donald E. Knuth (at Stanford) decided in ~1980 that he had to rewrite his TeX program (that had been written in the SAIL language) into something that was maximally portable, he chose Pascal (after writing his own WEB system partly to overcome the deficiencies of Pascal). It was possibly one of the most portable (and ported) programs of its size and complexity at the time—the early TUGboat issues are full of reports of different people having ported the program to their installations. – ShreevatsaR Jun 29 '17 at 22:36
  • @JeremyP wrote "professional software developers don't have dogmatic views about which language to use, they look at the target platform and then pick (what they think is) the best language that is available" I am seriously trying to decide whether you are joking here. I cannot go anywhere without having decisions tainted by software developers (professional or otherwise) who decide based on dogmatic views about the language. I usually do not encounter developers who pick the best or most appropriate language, rather the language they root for as one roots for a sports team. – Aaron Jun 29 '17 at 23:19
  • I thought the Pope was more into Assembly language. – Wossname Jun 30 '17 at 09:11
  • @Aaron Then they are unprofessional. – JeremyP Jun 30 '17 at 10:21
  • 5
    @Wossname No the Pope really likes C. I read somewhere that, in fact, he is a C-aholic. At least I'm pretty sure that's what it said. – JeremyP Jun 30 '17 at 10:23
  • @JeremyP, oh boy that was a real Hail Mary. – Wossname Jun 30 '17 at 12:40
  • 1
    Pascal was preferred over C in academia because the convention was to pop the arguments to a procedure as part of the return (8086 ASM: RET 6). C requires the caller to pop the arguments off the stack after return. This makes Pascal marginally faster than C, hence its popularity on 1, 2, 4 and 4.77 MHz processors. Pascal compilers were written to protect the programmer from his/herself and from crashing the OS (array bounds checks, addressing exceptions, buffer overruns etc.). This was (and still is) considered A GOOD THING. Strange that so many security exploits rely on such vulnerabilities. – ChrisR Jun 30 '17 at 16:40
  • 3
    @ChrisR Actually C has always been faster than Pascal. Pascal mandates bounds checking and other sanity checks, as you say, which is a much bigger drag on performance than when you pop function arguments off the stack. Pascal was preferred as a teaching language to C because it is a better teaching language. The syntax is less confusing for beginners and it has all those safety features built in. – JeremyP Jul 02 '17 at 14:06
  • @JeremyP Perhaps that makes them ethically unprofessional, yet the portion of paid engineers who do this is, in my experience, very high. In both current and previous jobs, there are plenty who insist on C++ for everything, a few who insist on Python for everything, and similar for other languages/techs. A previous manager of mine who dissed Java and Python even said that he sees Java & JVM langs as a fad dying in a decade and C++ taking its rightful place as industry standard and universal. My encounters with technically&linguistically open-minded engineers are the exception at multiple jobs. – Aaron Jul 05 '17 at 14:46
  • @Aaron You have been unlucky then. I certainly have met people who are the way you describe, but in my experience, they are the minority. – JeremyP Jul 05 '17 at 15:00
  • People have been saying that Java is a fad since the mid-90s. One of these days, they are going to be right! @Aaron – Cody Gray - on strike Jul 10 '17 at 07:58
31

One option might have been using “Small C,” which was published in 1980 in Dr. Dobb's Journal magazine.

Initially it generated code for the 8080, but it was adapted for a few other CPUs. It was also adapted to generate code for DOS/8088, but I do not know the date.

Small C was written in itself, so you would need a CP/M-80 system to do the port.

But if I recall correctly, at the time, most things were done in BASIC or assembler. There may have been a Forth implementation, but that was always a niche language.

J F
mannaggia
  • 1
    Small C output 8080 asm. And there may have been 8080 to 8086 asm translators available in the 1980 time frame. – hotpaw2 Jun 28 '17 at 02:13
  • 2
    The original Small C did output 8080 code but there were ports to output 8086, I just don't know when it was done. And since Small C source was available, doing that port was feasible. – mannaggia Jun 28 '17 at 02:24
  • It was written in itself? Oh no, chicken and egg. Which one came first ;) – Simon Jul 02 '17 at 06:52
  • 1
    @Simon I presume you’re just having fun there but if not it’s a fascinating thing, bootstrapping. gcc does this as well, and Ken Thompson of course described an ingenious way it can be abused in his piece Reflections on Trusting Trust. And I agree with the idea that trying to figure out how it works is an excellent exercise. I already knew how but it’s still a fascinating concept. But to extend your question: which came first, the C compiler or the C compiler? (And then there are compiler compilers.) – Pryftan Mar 04 '18 at 23:57
30

The BDS C compiler was released in 1979, ran on CP/M, and was capable of generating code for the Intel 8080 microprocessor. (It also ran on and generated code for the Zilog Z80, but that's not relevant here). This was a very popular, well-known CP/M compiler, and as Wikipedia says:

It ran much faster and was more convenient to use than other Z80-hosted compilers of the time. It was possible to run BDS C on single-floppy machines with as little as 30K of RAM - something of a minor miracle by comparison to most other commercial compilers which required many passes and the writing of intermediate files to disk.
[ … ]
BDS C was very memory efficient, with fast compilation speeds.

Since the 8088/8086 processors used in the IBM PC are largely compatible with the 8080, I believe that using BDS C on a CP/M machine would have been a viable path.

The two processors aren't completely binary-compatible—as in, an 8088 won't run 8080 code as would have been generated by the BDS C compiler. But, the two processors are compatible on the assembly language level, which means that the binary code could have been easily transcoded using an automatic tool, or even by a human assembly-language programmer looking at the 8080 source disassembly. All you needed to do was to translate the opcodes over.
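
To make this concrete, here is a toy sketch (purely illustrative) of the table-driven substitution at the heart of such a translation tool. The register mapping follows Intel's published 8080-to-8086 conventions (A maps to AL, HL to BX, DE to DX, and so on); real tools such as DRI's XLT86 or Intel's CONV86 also rewrote operands and, as noted in the comments below, sometimes had to expand one 8080 instruction into several 8086 instructions.

    /* Toy 8080-to-8086 translator: exact-match substitution on a few
       one-for-one instructions. Anything not in the table would need
       operand rewriting or hand attention in a real tool. */
    #include <stdio.h>
    #include <string.h>

    struct rule { const char *i8080; const char *i8086; };

    static const struct rule rules[] = {
        { "MOV A,B", "MOV AL,CH"  },  /* 8080 A->AL, B->CH        */
        { "XCHG",    "XCHG BX,DX" },  /* HL<->DE becomes BX<->DX  */
        { "RET",     "RET"        },  /* some map across directly */
    };

    const char *translate(const char *line)
    {
        size_t i;
        for (i = 0; i < sizeof rules / sizeof rules[0]; i++)
            if (strcmp(line, rules[i].i8080) == 0)
                return rules[i].i8086;
        return NULL;  /* not mechanically translatable by this toy */
    }

    int main(void)
    {
        const char *out = translate("MOV A,B");
        printf("%s\n", out ? out : "needs hand translation");
        return 0;
    }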

Alternatively, you could have used Ron Cain's Small-C compiler, the source code for which was published in the May 1980 issue of Dr. Dobb's Journal. This compiler also targeted the 8080, but since its source was available (at some point, it was released into the public domain, but I can't find a precise date), you could have modified it to target the 8088 with minimal effort—and I really do mean minimal. Small-C generated assembly code as its final output, which then had to be translated into machine code by an assembler, so all you really needed to do was plug in an x86 assembler.*
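
To see why the retarget really would be minimal, consider that a code generator which emits assembly text is mostly a collection of routines printing fixed strings. The sketch below is hypothetical (it is not Small-C's actual source, and the function names are invented), but it shows the shape of the change: each code-generation primitive just needs a second set of output strings.

    /* Hypothetical sketch of a retargetable, Small-C-style code
       generator. Each primitive emits assembly text for the chosen
       target; porting to the 8086 means supplying new strings. */
    #include <stdio.h>

    enum target { CPU8080, CPU8086 };
    static enum target tgt = CPU8086;

    /* load a global int into the primary register */
    static void gen_get_global(const char *name)
    {
        if (tgt == CPU8080)
            printf("\tLHLD %s\n", name);      /* HL <- word at name */
        else
            printf("\tMOV AX,[%s]\n", name);  /* AX <- word at name */
    }

    /* add the secondary register to the primary register */
    static void gen_add(void)
    {
        if (tgt == CPU8080)
            printf("\tDAD D\n");      /* HL <- HL + DE */
        else
            printf("\tADD AX,DX\n");  /* AX <- AX + DX */
    }

    int main(void)
    {
        gen_get_global("count");
        gen_add();
        return 0;
    }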

Either way, this would allow you to write and debug all of your C code on the CP/M machine, meaning that you wouldn't need access to any IBM pre-release hardware (which wasn't exactly forthcoming; the PC was basically a skunkworks project, kept secret from most of the rest of the industry). CP/M machines were very affordable at this time, and there were plenty of them to choose from. If you wrote reasonably portable C, the porting would have been absolutely trivial. And then, once a C compiler was eventually released for the platform (and you knew it was going to be), you could drop the post-compilation opcode-translation step, switching your build process over to, for example, the newly-released Lattice C compiler in 1982, which ran natively on the IBM PC under PC-DOS.
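
As a concrete illustration of "reasonably portable C" (and of the wrap-your-system-calls idea expanded on in the comments below), the sketch here funnels all console output through one function. The CPM and MSDOS macro names and the bdos() helper are assumptions for the example, although several compilers of the era did ship a bdos()-style call; the point is that only this thin layer needs attention during the port. Conveniently, DOS's function 2 mirrors CP/M's BDOS function 2, which is part of what made such ports easy.

    /* conout: write one character to the console.
       Sketch only; macro names and bdos() are illustrative. */
    #include <stdio.h>

    void conout(int c)
    {
    #if defined(CPM)
        bdos(2, c);   /* CP/M BDOS function 2: console output      */
    #elif defined(MSDOS)
        bdos(2, c);   /* DOS INT 21h, AH=02h: same function number */
    #else
        putchar(c);   /* portable fallback while prototyping       */
    #endif
    }

    int main(void)
    {
        const char *p;
        for (p = "hello\r\n"; *p; p++)
            conout(*p);
        return 0;
    }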

More realistically, though, the only reason you'd even need to do this would be to make sure that you had software ready to run on the IBM PC on the day of launch. But most vendors weren't doing that—no one expected the IBM PC to be the runaway success that it was.* So, what is more likely is that you were a shop developing software for CP/M machines and already using the BDS C compiler. You'd continue doing so until it was obvious that the IBM PC was going to catch on, and then it would be a simple matter of porting your existing C code base to Lattice C or any other newly-released C compiler targeting the IBM PC.

__
* The few vendors who were writing serious software for the IBM PC in 1980–81 were doing it in assembly. Microsoft had an assembler up and running (they had to—they were using it to steal, er, develop DOS), and Intel certainly had one.

There was also some commercial software development done in Pascal. IBM released a Pascal compiler (developed by Microsoft) for the PC in 1981, alongside its August launch, and I imagine that prototypes were available to prospective vendors (though I don't know this for certain). There were also other vendors who had Pascal development environments, and the UCSD Pascal system was one of the available operating systems for the IBM PC, in addition to CP/M and Microsoft's DOS.

Cody Gray - on strike
  • 4
    Two translators from 8080 to 8086 assembly were XLT86 from Digital Research, and CONV86 from Intel, as mentioned by Hans Passant at this SO answer: https://stackoverflow.com/a/32414213/371250 – ninjalj Jun 27 '17 at 11:30
  • 3
    That's an interesting idea, translating the assembly code, assuming the compiler generated assembly instead of binary code. The Small C compiler I mentioned in my answer did generate assembler code. However, in addition to translating the assembly code, the BIOS and DOS calls would need to be handled too. CP/M-80 used simple JMP opcodes, whereas the IBM/8088 used INT interrupt vectors. – mannaggia Jun 27 '17 at 15:14
  • @mannaggia You can always disassemble it if it's in bytecode - even a regex(p) could probably manage that job as it's mostly just substitution. – wizzwizz4 Jun 27 '17 at 16:14
  • 5
    @mannaggia PC DOS supported (and I think early versions officially supported) the CALL 5 system call mechanism of CP/M, even though it was deprecated. From what I recall, that was specifically done for compatibility with machine-translated CP/M software, though this turned out to not be used much in practice. – user Jun 27 '17 at 20:43
  • 3
    Yup, as Michael says, DOS 1.0 was largely CP/M compatible. But even if it wasn't, all you'd need to do was wrap all of your system calls in a library, then it would be a simple matter of translating the code in that library. Even if you had to write the implementation for these library functions straight in x86 assembly, that would be a minimal amount of work, and they'd still be callable from C. In other words, write a portable abstraction layer! Yes, it's work, but no one ever said targeting a brand new platform was easy! – Cody Gray - on strike Jun 28 '17 at 09:41
  • 1
    @CodyGray Forget brand new and substitute not even introduced yet. We are talking targetting an architecture that won't be publicly available for at least another half year or so (Wikipedia puts the '5150 introduction date as August 12, 1981, and OP specifies "early 1981" at the latest, so let's split the difference and call it mid-February). Never mind the fact that the development of the hardware only began in July 1980. Someone in the position of the individual the OP discusses would likely be privy to quite a bit of confidential information about a still-in-development product. – user Jun 28 '17 at 20:24
  • 1
    The answer says that 8080 and 8086 are compatible at the assembly level so you can just translate the opcodes at the binary level. It's not that easy - the 8080 and 8086 had instruction set differences that sometimes required turning one 8080 instruction into three or more 8086 instructions. (See Appendix A of the CONV86 manual.) Note this changes the code size so branches would all change too. So conversion needed to be done with the assembly source code, not the binary level. – Ken Shirriff Jun 28 '17 at 21:41
  • 1
    @KenShirriff Your concluding sentence doesn't necessarily follow from what precedes it. For example, you could imagine a disassembler that parses the binary code and generates a target table for each memory location referenced, then modifies those as needed when the machine code length changes. Fragile, yes, but not impossible. – user Jun 30 '17 at 10:00
  • I don't think trying to compile code for x86 by compiling for 8080 and translating it would really make sense. If i is an int of automatic duration and arr is an int[], the best 8080 code for arr[i]-- (written with Z80 mnemonics) would likely be something like ld hl,_i / add hl,sp / ld e,(hl) / inc hl / ld h,(hl) / ld hl,_arr / add hl,de / ld a,(hl) / add a,#255 / ld (hl),a / jp nc,gone / inc hl / dec (hl). Even a relatively simplistic 8088 compiler should be able to manage mov bx,[bp+_i] / add bx,bx / dec word [_arr + bx], and even one that made no attempt at optimization... – supercat Sep 08 '21 at 20:37
  • ...would yield something like mov bx,_arr / push bx / lea bx,[bp+_i] / push [bx] / pop ax / add ax,ax / pop bx / add bx,ax / dec word [bx], which, while pretty horrible, would still be better than anything that could be produced by translating 8080 code. – supercat Sep 08 '21 at 20:41
10

A possible answer is cross-development. If C is the right language for the job, then a prototype can be written in C on any platform that already has it (say, PDP-11, also a little endian 16-bit architecture, with a well-established C environment).
A command-line program would be portable enough, maybe with a few include file modifications and taking care of the infamous "text mode" vs "binary mode".
If some kind of direct text-mode screen access is needed for interactivity, a prototype implementation could model it with the CURSES library, to be rewritten later to write directly to video memory.
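
A minimal sketch of that layering, assuming a hypothetical PCVIDEO compile-time switch: the prototype builds against CURSES on the development machine, and the eventual PC build swaps in direct writes to the text-mode frame buffer.

    /* screen_put: write one character at a given screen position. */
    #ifdef PCVIDEO

    /* PC build: poke the colour text-mode frame buffer at segment
       B800h; 80x25 cells, each an (attribute << 8 | character) word.
       "far" is the DOS-era compiler extension. */
    void screen_put(int row, int col, int ch)
    {
        unsigned short far *vram = (unsigned short far *)0xB8000000L;
        vram[row * 80 + col] = 0x0700 | (ch & 0xFF);  /* grey on black */
    }

    #else

    /* Prototype build: model the screen with curses. */
    #include <curses.h>

    void screen_put(int row, int col, int ch)
    {
        mvaddch(row, col, ch);
        refresh();
    }

    int main(void)
    {
        initscr();               /* curses setup */
        screen_put(5, 10, 'X');
        getch();                 /* wait for a key before exiting */
        endwin();                /* curses teardown */
        return 0;
    }

    #endif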

Leo B.
  • Yes, that could very well be a way to go. But a minicomputer like the PDP-11 is pretty expensive, isn't it, like five-digit price tag? What's the cheapest computer that can be used that way at that time? – rwallace Jun 27 '17 at 05:27
  • 4
    @rwallace You may want to clarify your question, then. One could assume that a person who would need to program for the IBM PC a year before its release and had selected C as the tool for the job, would likely be in an academic or industrial setting with ready access to mainframes or minis. – Leo B. Jun 27 '17 at 05:33
  • 2
    I don't have time to really search now, but there were probably some viable C compilers for CP/M systems. Even starting with an 8-bit (e.g., Z80) CP/M system would have gotten you close to the environment of the IBM PC at a reasonable price. – manassehkatz-Moving 2 Codidact Jun 27 '17 at 05:36
  • 2
    @LeoB. Okay, in this case the people involved (protagonists in alternate history fiction) are running a startup company, they are tight on capital so they don't have ready access to expensive equipment. Some form of academic partnership might perhaps be possible. – rwallace Jun 27 '17 at 05:52
  • 1
    @rwallace - Perhaps an alternate history where Microsoft doesn't take over the world? – manassehkatz-Moving 2 Codidact Jun 27 '17 at 06:32
  • @manassehkatz Perhaps! – rwallace Jun 27 '17 at 06:51
  • 3
    @rwallace I believe that it was possible to buy some CPU time at night for a reasonable price from academic or maybe even commercial entities using remote access, if one had a videoterminal and a modem and lived in an area where a call to the provider would be local, therefore free. – Leo B. Jun 27 '17 at 07:29
  • 2
    PC BIOS was cross-assembled using Intel's ASM86 (http://www.os2museum.com/wp/the-ibm-pc-bios-and-intel-isis-ii/) – sendmoreinfo Jun 28 '17 at 07:07
  • 1
    @Leo B - We all ran TI Silent 700s or Olivetti paper-based Teletype machines at either 300 or (if you were lucky) 1200 baud into the bureaus. On IBM it was golf balls. Videoterminal? That would have been heaven – ChrisR Jun 30 '17 at 17:11
  • 1
    @ChrisR In 1981 there were videoterminals even in the Soviet Union. Granted, also at 300 Baud in most installations (e. g. Moscow State University), but personally I was lucky with 1200 at the Academy of Sciences Computing Centre. I would assume that in the West at the same time videoterminals would be at least as accessible. – Leo B. Jul 01 '17 at 02:07
6

You could do the following:

  1. Compile your C code to 8080 assembler, using one of the CP/M C compilers mentioned in the other answers.
  2. Use a transcompiler like TRANS (1980) or Digital Research XLT86 (1981) to convert the output from #1 to 8086 assembler.
  3. Use an 8086 assembler to compile the output from #2 to a .COM file.
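
For illustration, the whole pipeline could even be driven from a small program. The tool names and invocation syntax below are placeholders, not real command lines; each vendor's compiler, translator and assembler had its own.

    /* Hypothetical build driver. cc80, xlt86 and asm86 stand in for
       whatever the compiler, translator and assembler were actually
       called on your system. */
    #include <stdlib.h>

    int main(void)
    {
        system("cc80 prog.c");     /* step 1: C source -> 8080 assembly */
        system("xlt86 prog.asm");  /* step 2: 8080 -> 8086 assembly     */
        system("asm86 prog.a86");  /* step 3: assemble to a .COM image  */
        return 0;
    }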

Partial credit goes to @mcleod_ideafix for his answer to a similar question.

snips-n-snails
3

C was available on the BBC Micro in the very early '80s.

You did need the third-party 68000 Second Processor, which was made by Torch Computers. The OS was Torch Unix. This configuration wasn't supported by Acorn, but it worked pretty well. This is how I first learnt C.

From memory there was also an add-on card for the Apple ][ which would give access to a C environment.

Chenmunka
3

You can either use a CP/M machine with an 8080 C compiler, porting your code when the IBM PC launches (August 1981) and BDS or Whitesmiths produces a compiler for it.

Or, and this was a common way of working then, you beg/borrow/steal access to a minicomputer with C (maybe Unix, maybe something proprietary) and develop on that.

Rich
  • Welcome to Retrocomputing. Could you provide some examples of software / groups that (were) developed like this? – wizzwizz4 Jun 28 '17 at 06:17
  • 1
    @wizzwizz4: Atari had a pair of DEC PDP-11/20 systems which were used to cross-develop software (i.e. games) for their systems. Later they had a number of VAXes which were also used for office email. Source: http://www.jmargolin.com/vmail/vmail.htm – Tim Locke Jun 28 '17 at 13:07
  • 1
    Not x86, but this game (http://www.mobygames.com/game/zx-spectrum/icicle-works) was written, in C and assembler, using a (Memotech) CP/M machine to compile code for the ZX Spectrum, which had the same Z80 CPU but no disks. – Rich Jul 05 '17 at 04:21
3

Can't comment yet, so I'll add another answer.

Small C has been mentioned.

Translators that would mechanically translate 8080 assembly to 8086 assembly were available, if you preferred not to do the job by hand.

I know people who did essentially this in 1983 or '84 -- hand-translated the assembly language source for the compiler and the output object.

Small C could theoretically be used to bootstrap a full C compiler, and that was also done. But it wasn't done as much because it did take real work, in terms of working out all the grammar rules and testing them.

On a little higher-tech level, C compilers (and other language compilers and interpreters) in those days were usually constructed using yacc and lex. So, if you had a budget and someone who understood yacc and lex, you might have constructed a cross-compiler on a cheap (so-to-speak) Unix minicomputer, which would then be shared by your dev department to write the software on.
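
To give a flavour of the grammar work involved, here is the classic two-level expression grammar, hand-coded as a miniature recursive-descent evaluator. A yacc specification declares rules of exactly this shape and generates the parsing code for you; multiply this by all of C's declarators, statements and operators to get a sense of the real work mentioned above.

    /* expr : expr '+' term | term ;
       term : term '*' NUMBER | NUMBER ;
       Hand-coded recognizer/evaluator for the grammar above. */
    #include <ctype.h>
    #include <stdio.h>

    static const char *p;  /* cursor into the input string */

    static long number(void)
    {
        long n = 0;
        while (isdigit((unsigned char)*p))
            n = n * 10 + (*p++ - '0');
        return n;
    }

    static long term(void)  /* '*' binds tighter than '+' */
    {
        long v = number();
        while (*p == '*') { p++; v *= number(); }
        return v;
    }

    static long expr(void)
    {
        long v = term();
        while (*p == '+') { p++; v += term(); }
        return v;
    }

    int main(void)
    {
        p = "2+3*4";
        printf("%ld\n", expr());  /* prints 14 */
        return 0;
    }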

(afterthought)

Using lex and yacc might make it possible, after cross-compiling the libraries and some other tools and lex and yacc themselves, to use the source code for the cross-compiler to bootstrap a native compiler.

That was the holy grail, of course, but it didn't always work that well.

The reason you might not want to use the source code for the Unix compiler, if you had that, is that you wouldn't want the full set of bells and whistles on a machine as limited as the IBM PC.

Conversely, some managers preferred not to have native compilers because that would be one more asset of unknown value to keep track of.

(end afterthought)

And, finally, if you had early access to the hardware or a mockup of some sort, you might have had early access to the compilers that had not yet been officially released.

An option that the Forth community often mentioned back then, but which never actually seemed to be used, might have been to write the compiler in Forth.

Speaking of Forth, it is my understanding that the early versions of WordPerfect were written in a Forth that the predecessor to WordPerfect Inc. wrote themselves. If you understood Forth, Forth would have been just as much an option for development as C at the time.

Joel Rees
  • 2
    Welcome to Retrocomputing Stack Exchange. You were right to post this as an answer and not a comment as it answers the question - please read the [tour]. It doesn't matter how many answers a question has. – wizzwizz4 Jul 09 '17 at 17:27
2

To be realistic... in early 1981, you wouldn't be writing software for the announced but not released IBM PC, because it was expected to be a proprietary box with a proprietary OS, excluding the (relative) wealth of CP/M software already available. IBM owned the big computing industry in 1981, and its PC was expected to be like its big iron: closed, proprietary, and not conducive to the rough-and-tumble world of the personal computer of the late '70s. It wasn't expected to sell very well.

You'd be targeting either Apple II or CP/M, to get the broadest possible market for 1981.

And you wouldn't be writing in C in 1981. You'd be writing in assembler, BASIC, maybe FORTRAN, whatever you could get a compiler for. C for personal computers didn't become popular until the mid-to-late 1980s.

Actually, writing one's own C compiler was not outside the realm of possibility in those days. Since the language is easy to parse, and is fairly close to assembler, writing a native code compiler wouldn't be that difficult, especially if the alternative was writing in assembler.
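
To illustrate "fairly close to assembler": each line of the C function below corresponds to only an instruction or two of 8086 code. The assembly in the comments is a plausible hand translation, not the output of any particular compiler.

    /* Summing an array: C operations map almost one-for-one onto
       8086 instructions (illustrative hand translation in comments). */
    int sum(int *a, int n)    /* say a arrives in BX, n in CX        */
    {
        int s = 0;            /*   XOR AX,AX                         */
        while (n-- > 0) {     /* L: JCXZ done, then DEC CX           */
            s += *a;          /*   ADD AX,[BX]                       */
            a++;              /*   ADD BX,2    ; int is 2 bytes      */
        }                     /*   JMP L                             */
        return s;             /* done: RET     ; result in AX        */
    }

    int main(void)
    {
        int v[3];
        v[0] = 1; v[1] = 2; v[2] = 3;
        return sum(v, 3) != 6;  /* exit status 0 on success */
    }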

tj1000
  • 1
    Welcome to Retrocomputing. Do you know when the announcement about the open standard was made? I expect it was probably close to the release date to prevent somebody else from beating them to it. – wizzwizz4 Jun 30 '17 at 05:36
  • 1
    You say "with a proprietary OS", but that's not how I remember it. As I understand it, Project Chess was expected to run CP/M, just like so many other popular microcomputers. DRM was the leading OS vendor at the time. The shock was when it ended up running a proprietary OS from a little-known vendor that was known only for languages. (CP/M was, of course, still available for the PC at launch, but it was much more expensive than PC-DOS, thanks to the contract with Microsoft.) – Cody Gray - on strike Jun 30 '17 at 10:39
  • 3
    That's how it was perceived when it was announced. The word on Kildall balking at IBM's NDA and lawyers, and Gates picking up the ball, had leaked out, and it was assumed that the PC would be like big IBM computers - as closed as possible. I was still in college in 1981, but in general, the IBM PC was viewed as The Evil Empire intruding on this new industry. Little did we know that IBM legitimized the PC to a lot of stodgy big iron types, and it would take off. Even IBM didn't think this - the original PC was planned for a 200,000 unit run. They ended up selling over four million. – tj1000 Jun 30 '17 at 13:34
  • 1
    From what I remember, the target OS for the PC was MP/M from Digital Research. MP/M turned a PC into 15 CP/M machines, each on a TTY (the 16th was the supervisor). Bill Gates apparently jumped in when DR were late on delivery and IBM did not want to wait. I also had a copy of 'Seattle DOS' for 8086 which I was told was the skeleton on which MS-DOS was written. – ChrisR Jun 30 '17 at 16:50
1

You could write an x86 backend for pcc (and a cross-assembler), or politely ask Whitesmiths Ltd. to deliver their x86 compiler earlier (InfoWorld reported on 29 September 1980 that it was at least three to six months away from release).

Or just write in assembler (Intel ASM86 running on an Intel MDS) -- that's how IBM wrote the PC BIOS, apparently: http://www.os2museum.com/wp/the-ibm-pc-bios-and-intel-isis-ii/

sendmoreinfo
  • 1
    The cited article, where Plauger from Whitesmiths is interviewed, just says that they are committed to releasing an 8080 compiler Real Soon Now™. Notice, not an x86 compiler! So you'd still have to do some kind of translation from the 8080 to the 8088/8086, as discussed in my answer and traal's answer. Certainly possible, but a bit more involved than simply waiting on Whitesmiths! – Cody Gray - on strike Jun 28 '17 at 14:50
  • 1
    "Whitesmiths Ltd. is working on a version of its C Compiler for the 8088/86, but it is at least three to six months away from release. " -- page 12, https://books.google.com/books?id=nT4EAAAAMBAJ&pg=PT11#v=onepage&q&f=false – sendmoreinfo Jun 28 '17 at 18:30
  • 1
    Ah, I found this interview via the Wikipedia article that you linked (I would have sworn you linked that directly in your answer, but obviously not). I didn't see the page you quoted there in the comment. You might [edit] that link & quote into your answer! – Cody Gray - on strike Jun 28 '17 at 19:06
  • 2
    Oddly, I noticed just the other day that the February 1984 edition of Byte (i.e. the Macintosh one — https://archive.org/details/byte-magazine-1984-02 p.20) contains a letter from Supersoft complaining about an unfair review of their 1982 compiler, which was "the first C compiler for CP/M-86 and MS-DOS", with the rest of the letter implying they mean the first for each individually, not the first to support both. So maybe Whitesmiths' compiler never shipped or shipped late? Though it's equally likely that Supersoft were using a suitably tailored test for 'first', given the mild ambiguity. – Tommy Jul 06 '17 at 17:50
  • I remember using Whitesmith's C but that was in 1983: not as early as 1980. There was also the Lattice C compiler which later became MSC but that only came out around 82/83: not as early as 80/81. – cup Jul 22 '17 at 08:32