
IBM released the IBM 5150 Technical Reference manual in August 1981, and included in it the fully commented source code listing for the BIOS. I find this odd for two reasons:

  1. IBM must have realized that creating a legal "clone" PC would be simplified by having this source code.
  2. Other manufacturers (Commodore, Apple, for example) fully documented their systems for programmers without including firmware source code.

Since cloning the PC ultimately undermined IBM's personal computer business, this seems like a case of incompetence: handing your competitors something that makes their job easier while providing no benefit to IBM. Is there an alternate or documented explanation for IBM doing this, besides simple incompetence?

Brian H
  • 7
    Given the long and convoluted legal history of IBM mainframes and the status of non-IBM peripherals, it was probably a reflexive move on the part of IBM... – Jon Custer Aug 13 '19 at 13:50
  • 54
Back then, it was really not unusual to publish proprietary firmware code as documentation, just like publishing full hardware schematics - some are detailed enough to allow you to reproduce the entire system - but that doesn't mean it's legal. They were mostly used for developing 3rd-party software/hardware, repairs, or modding. Another classic example is the HP-45 pocket calculator: the complete ROM listing is published in US patent 4001569 (http://pmonta.com/calculators/us-patent-4001569.pdf, see page 47). Reading the code really helps to understand the scientific calculating techniques of that time. – 比尔盖子 Aug 13 '19 at 16:10
  • 3
@比尔盖子 That should be an answer; I would post it myself if not for your comment. I myself studied the ROMs of the ZX Spectrum, XT and AT extensively to be able to write better software. – Rui F Ribeiro Aug 14 '19 at 09:05
  • Pretty sure their incompetent part came with their deal with Microsoft. – LarsTech Aug 15 '19 at 22:18
  • I didn't see any of the answers state the obvious. Nice interactive disassemblers existed. Along with motherboard specs and specs on the various chips, it would not have taken a talented assembly programmer long to produce a readable and comprehensible assembly-code output. Even if a PC license forbade disassembly, it was going to happen, repeatedly, anyway. IBM publishing the code themselves was simply accepting the inevitable, and making a few bucks off it. – RichF Aug 17 '19 at 21:10
  • 2
    The source was under copyright. This is the same mechanism that protects GNU licensed software today. – Thorbjørn Ravn Andersen Mar 10 '20 at 12:00
  • What was the state of US copyright law in 1981? Was it necessary to publish the text in order to have the text be copyrighted? – dave Mar 10 '20 at 12:04
IBM did not have high hopes for this microcomputer thing. The market was there, but it was absolutely not clear that microcomputers would dominate over mainframes and minicomputers (they did so only much later). – Peter Parker Mar 11 '20 at 08:34

8 Answers

106

In the late '70s and very early '80s it was not unusual to make BIOS source code available. Apple did indeed do so; the full source listing starts at page 76 of the Apple II Reference Manual. Atari did the same in the Operating System Source Listing section of their Atari 400/800 Technical Reference Notes.¹ For CP/M machines, having the BIOS source was near essential if you wanted to add new hardware to the system that could be used by CP/M. So IBM's publication of their BIOS source code in Appendix A of the Technical Reference Manual was not unusual.

Nor does having BIOS source available simplify making a clone; in fact it may make it more complex. To avoid copyright issues, your clone's BIOS must not copy anything from the published source. This means that even if you independently come up with a similar solution for similar reasons, you may have to rewrite your solution not to be so similar to what the published source does because you could be accused of copying it from the original source. Having source code with comments increases the scope of work that can't be copied, as compared to just object code.

Nor was it incompetent not to foresee, before the release of the first IBM PC, that clones would eat IBM's lunch. Nobody at that point could even be sure that a single standard system for microcomputers would ever become a thing; it certainly hadn't happened up to that point. Even if someone had had such amazing precognitive abilities that they could have seen this, I'm not sure that IBM ended up making less from their fraction of a huge PC market than they would have made from the totality of a much smaller IBM PC market that they owned exclusively. The whole point of clones was that they were significantly cheaper, and if the market couldn't move to cheaper by cloning PCs, it would have moved to cheaper by using less-compatible hardware and shifting the compatibility burden to software developers.


¹Of the early PC manufacturers, Commodore appears to be one of the very few that did not distribute their BIOS (which they called "KERNAL") source, though it later leaked out. Perhaps this was because it was somewhat more substantial than other early BIOS code, being a bit closer to an actual operating system.

cjs
  • That's a really good point about the "Compatibility burden" being something shared between hardware and software industries, with a constant push-pull over the sharing of the burden. – Brian H Aug 13 '19 at 14:18
  • 8
Even today, for most microprocessors, particularly in the ARM implementation world, the processor manufacturer publishes extensive example code or an HDK, including bootstrap/bootloaders, bare-metal IO drivers, and sample applications, which inevitably proliferate if the platform takes off. This is done in order to encourage adoption. In 30 years someone might ask, why was the Android bootloader public? Because otherwise nothing gets done. – crasic Aug 14 '19 at 02:06
  • 9
    @crasic Perhaps true for microcontrollers, but in the higher-performance ARM SoC's, most documentation is available only with strict non-disclosure agreements, and most of the firmware is only a binary blob. – jpa Aug 14 '19 at 07:03
  • Also it took quite a while for clones to become 100% compatible. If the BIOS I/O had been faster this might not have been necessary. – Thorbjørn Ravn Andersen Aug 14 '19 at 10:26
  • 4
There were clones for IBM mainframes, the Amdahl mainframes, but in that case, Amdahl didn't end up dominating the market, so perhaps IBM was thinking a similar thing would happen with PCs. However, early PCs used a collection of off-the-shelf components, so cloning would be easier. The big change in clone dominance occurred with the 386, with IBM pushing for its Micro Channel-based PS/2 systems, while a group of clone makers developed a cheaper EISA standard. – rcgldr Aug 15 '19 at 04:18
  • 3
continuing - It was during this era that the top 20 PC companies accounted for only about 50% of the market, with the remainder being small operations building 386 AT clones with off-the-shelf components. This also coincided with Apple raising prices for the Mac across the board in late 1989, followed by the 386 clones and Windows 3.0/3.1, reducing the Mac's market share from 25% down to about 5%. – rcgldr Aug 15 '19 at 04:19
  • @curt It's interesting that you mention the Apple II especially since they got burned on that one and very explicitly didn't do it for the Mac - original story here: https://www.folklore.org/StoryView.py?story=Stolen_From_Apple.txt – KlaymenDK Aug 16 '19 at 17:13
  • @rcgldr Do you have a source for a 25% market share for Macs? That sounds very high. I could believe Apple had 25% of the market at that time, but that would have included various members of the Apple II line (primarily e and gs) as well. – chepner Aug 16 '19 at 19:32
@chepner - I agree, seems very high. It was from a magazine article. I do recall the 25% number dropping down to the 5% number. That may have been market share in terms of dollars and not units (the Macs cost more). Apple did raise prices in late 1989 across the board on existing products, making the color Macs relatively expensive; combined with relatively cheap 386 EISA clones, Windows 3.0 in 1990, and Windows 3.1 in 1992, the Mac share declined rapidly. – rcgldr Aug 16 '19 at 21:06
  • Sounds plausible. My only real recollection was that the Mac LC was the first not-really-expensive entry in the Mac lineup (circa 1991), and I always assumed that Mac marketshare was always fairly small, staying relatively constant as DOS/Windows-based PCs gained share at the expense of other offerings from Apple, Commodore, and Atari (just to name the ones I was familiar with in America). – chepner Aug 16 '19 at 22:30
  • @KlaymenDK Apple was in no way, shape or form "burned" by the Franklin thing. Not only did they win the lawsuit, but Franklin didn't copy the source; they copied the binary image of the ROM, and the availability of the source made no difference to that. The source might have been useful if Apple hadn't also made the schematics available, but with the schematics, BIOS source availability made little difference to someone making a clone. – cjs Aug 16 '19 at 23:08
  • While Commodore did not make their KERNAL source public, third parties disassembled and reverse engineered it, and published it with extensive comments in book form. As far as I can recall, Commodore was not bothered by that (at least in case of the C-64) -- their control of MOS/CSG probably meant that no clone could undercut them and be 100% compatible. – Michael Graf Mar 10 '20 at 11:18
  • @MichaelGraf Disassemblies are not useful for clones. Clones generally don't need them anyway, since the clone manufacturer can simply copy the binary code (as many did for the Apple II), but the copyright owner can sue them to put a stop to that (as Apple invariably did). Note that fully compatible IBM PC clones became widespread only after Compaq and others did clean-room reimplementations of IBM's BIOS. – cjs Mar 10 '20 at 15:50
  • 1
    "it was not unusual to make BIOS source code available" => Not only that, but also IBM was used to shipping source code to customers for their mainframe and midrange products. It was only in February 1983 that IBM made their infamous "Object Code Only" announcement, that they were going to stop shipping the source code to their mainframe software products. So, releasing the source code to the IBM PC BIOS in 1981 was rather consistent with IBM's broader culture at the time, even if that culture would soon drastically change. – Simon Kissane Aug 18 '22 at 00:01
74

When other manufacturers attempted to copy the BIOS from the source listings, IBM sued them for copyright violation and won. Besides, even without the listings, anyone would have been able to dump and disassemble the BIOS. Publishing the source code made it harder to argue that the engineers hadn’t seen or used it.

What took IBM by surprise was the strategy of a company called Phoenix Technologies in 1984. They had one team of engineers look at the BIOS source and write a complete specification of what each part of it did, and a second team, who had never seen IBM’s copyrighted code, re-implement the BIOS from the spec in a “clean room.” This stood up in court and became the missing piece that allowed other companies to make 100% IBM-compatible PCs.

Davislor
  • 3
Should be noted that this approach would have worked even without source code. IBM stumbled with the PC in the early days (remember the PCjr?) and users had a voracious appetite for compatible hardware. That meant the clones could hit scales that IBM simply wasn't able to handle, at least not with the ramp-up time required, and the clean-room work on the BIOS was justified commercially. Even without source code this equation would not have changed. – madscientist159 Aug 14 '19 at 07:13
  • 5
    Having the source code surely saved Phoenix at least a little effort reverse-engineering the software. – Davislor Aug 14 '19 at 07:22
  • 3
    A little effort, yes, but in those days your source tended to be assembler anyway. It's not even in the same order of magnitude of effort as, for instance, trying to reverse engineer a complex C++ program from just the binary. – madscientist159 Aug 14 '19 at 20:53
  • @madscientist159 Agreed. That’s what I was getting at when I said they’d have been able to disassemble the binary. Knowing the variable and label names still helps. – Davislor Aug 14 '19 at 22:20
  • 3
    @madscientist159 One of Ralf Brown’s books talks about obtaining an OEM MS-DOS developer’s package from Microsoft that included the .OBJ files. He wrote that it was just as good as having the source code: it gave him everything but the comments. Which were probably out-of-date and misleading anyway. – Davislor Aug 14 '19 at 22:22
  • 2
    @madscientist159 reverse engineering hand-crafted assembler can often be harder than doing it for compiled code (I've done plenty of both). Native assembler code is often far less modular, with the semantic intent behind chunks of code being harder to reason about than when you get lots of readily identifiable small functions. – Alnitak Aug 15 '19 at 11:16
34

Just because they released the source code didn't mean that copyright no longer applied. They didn't "open source it".

Having access to the source was as effective a form of documentation for interoperating with the machine as anything else available.

Back in the day, we had a stack of microfiche (I'd guess 100+ pages of fiche film) with (apparently) the source code to DEC's VMS on it. It came with the system. Pages of assembly printouts.

Will Hartung
  • 8
    They did open source it, but they didn't license it as FOSS. Open source does not imply copyleft. – forest Aug 14 '19 at 05:55
  • The IBM BIOS source was published: any member of the public could buy a copy of the technical manual and read it. This was not true of VMS, as far as I'm aware; DEC carefully controlled and limited the distribution of VMS source via strict licensing. (No competitor would ever be able to get a license to that source, for example, at least not overtly.) – cjs Aug 14 '19 at 07:55
  • 12
    @forest No, they published the source. To be open source, the licence must give the distributee the right to use, modify and redistribute the software. If IBM were suing other companies for using their source code, it was not open source. – JeremyP Aug 14 '19 at 12:33
  • 1
    @JeremyP That's a common misconception about open source software. OSS != FOSS. – forest Aug 14 '19 at 21:55
  • 5
    @forest Nope. If you publish the source code but people cannot reuse it, it is not open source. https://en.wikipedia.org/wiki/The_Open_Source_Definition – JeremyP Aug 14 '19 at 22:21
  • 3
    @forest Even your link denies your definition of open source although it criticises the term because people (you included) wrongly think it means "you can look at the source code". – JeremyP Aug 14 '19 at 22:25
  • @JeremyP Nearly all open source software is free software, but there are exceptions. First, some open source licenses are too restrictive, so they do not qualify as free licenses. For example, “Open Watcom” is nonfree because its license does not allow making a modified version and using it privately. Fortunately, few programs use such licenses. – PC BIOS license is similar, as are other "open source" ones like MS-DOS. – forest Aug 14 '19 at 22:29
  • I guess you're right if using the OSI definition of Open Source, but I'm using the FSF definition that Richard Stallman came up with, which defines it as source code which is available to the public, but which does not (necessarily) have a copyleft license. This is why RMS prefers the term "libre". From Wikipedia, quoting the definition that I use, Richard Stallman argues the obvious meaning of term "open source" is that the source code is public/accessible for inspection, without necessarily any other rights granted. – forest Aug 14 '19 at 22:32
  • 2
    @forest the OSI definition is what almost everyone means when they use the term. It can essentially be assumed that when people use the term that they are referring to the OSI one rather than the not well known FSF version. – Qwertie Aug 15 '19 at 03:12
  • @Qwertie There's a big following around Stallman and his definitions, but I will concede and admit that I'm holding the minority opinion, even if it's a vocal minority in some software development cultures. – forest Aug 15 '19 at 03:13
  • 1
    I would just like to add that copyleft != FLOSS. Code released under a BSD-style license would be free software (libre) and open-source but not under a copyleft. – user7214865 Aug 15 '19 at 08:19
  • Will: Open Source depends on copyright, so your sentence makes no sense. @forest publishing nōn-Open Source source code is nowadays commonly called “shared source” after a Microsoft model doing so. – mirabilos Aug 16 '19 at 18:09
  • @mirabilos Open Source depends on license. Copyright is copyright. The licensing determines the "open source-ness" of it. – Will Hartung Aug 16 '19 at 19:51
  • @WillHartung you don’t seem to understand how copyright and licences belong together (a licence is a permission to do things with a copyrighted work that is otherwise not permitted due to copyright law’s protection). – mirabilos Aug 17 '19 at 01:49
  • @mirabilos Shared source is usually more specific to limited-distribution source code. – forest Aug 17 '19 at 06:03
  • Well, it’s limited distribution, not Open Source. (And the distribution where it was coined from was from Microsoft to the general public, but under a restrictive licence that’s not OSS, so IBM’s publication is even more limited than that.) – mirabilos Aug 17 '19 at 14:19
  • @forest Why use rms's definition when you have the correct one? – JeremyP Aug 19 '19 at 16:49
  • I have also heard the term "source-available" referring to software whose source code is visible to the public but not under a license that is considered "open source" by the OSI (basically, a more universal and unambiguous term for RMS's definition of "open source"). – user7214865 Aug 21 '19 at 15:56
34

Harry Potter and the Half-Blood Prince is another work whose source code is entirely available for public view, yet it's definitely under copyright and you will get in big trouble for commercially copying it.

Making it public and releasing it from copyright are two separate things.

In order to clone the IBM BIOS, clone makers had to write totally new software that did the same things, and be able to prove in court that the writers of this software did not lift or copy any part of the original.

This process is covered in detail in the first few episodes of Halt and Catch Fire.

Thorbjørn Ravn Andersen
11

When IBM published this source code, it was to make it easier for other companies to make peripherals.

They wanted there to be a lot of cards that could be slotted into the PC and just work. Having all these cards available would increase the market for the PC itself.

Stig Hemmer
10

I can assure you it was not altruism! In fact, it made it darn hard to copy legally (see Eagle Computer) and get away with it. Phoenix had to prove that none of the people who wrote the code had EVER read the IBM-published code. And I can tell you that nearly all of us in that era had. So while it seems simple, it was a great way to freeze the competition. They had to wait until there were enough assembler programmers on the 808x series who had resisted temptation. Copyright law specifies about 60% commonality, I think. So it was harder than it looked.

Eagle did rewrite theirs under duress, and while it was a darn good copy, the rest of the industry was already moving on to Phoenix. IBM pulled a subtle, well executed strategy. Sadly they blew it in the MCA debacle, but hey, all of the companies at that time made a few of those!

Toby Speight
billDickens
7

You are working from the rather popular assumption that hogging a brook will be preferable to tapping a river. The comparatively open nature of the IBM PC internals created a market that would not have existed otherwise, and IBM's profits from that market exceeded their expectations.

Now make no mistake: this was not particularly unprecedented. CP/M had example BIOS code available, my own NASCOM II came with a complete assembly listing of its system ROM, and I have the same for an Atari 400 here in an official Atari binder. It was quite customary for home computers to have schematics and system ROM listings readily available, and the market was somewhat split here. In comparison, it was a bit unusual that the typical Microsoft ROM BASIC that was also included with a number of home computers never came with assembly listings or even API documentation, and was a black box.

  • 3
    One of my prized possessions when I was, you know, quite a bit younger, was a book that was a commented disassembly listing of Microsoft BASIC for the TRS-80. As I recall, it was missing the opcode for each line, so they wouldn't break copyright, but that was easy to fill in by hand. I learned so much from that book. I thought it was this one: http://www.classiccmp.org/cpmarchives/trs80/mirrors/pilot.ucdavis.edu/davidk/documentation/62-1001.htm, but this is a little different than the one I had – Mohair Aug 14 '19 at 22:06
  • "popular" assumption? How on earth does one "hog" a brook (or even tap a river? wouldn't your knuckles get wet?) ?? – Mawg says reinstate Monica Aug 15 '19 at 15:17
  • @Mawg you hog a brook by diverting all of its water to irrigate your field. – Jeremy Friesner Aug 16 '19 at 16:42
One lives & learns. To think that I had to read a retro computing forum to learn that :-) – Mawg says reinstate Monica Aug 18 '19 at 08:33
0

I find the framing of this decision as incompetence rather interesting. This presumes that every business is only, ever, out to make money and/or monopolize the market. I realise this is commonplace in a capitalist system, but I don't like to assume that it is prevalent EVERYWHERE. Perhaps they just provided the source code to be nice, or to advance personal computing faster than they as a company could do themselves.

user66001
Altruism is a poor survival strategy for individuals. It is a great survival strategy for groups of individuals. Take for example the teams of searchers who go looking for lost children. Many times searchers are lost during these kinds of rescue operations. This is a poor survival strategy for the people doing the searching (yet it is wired into our brains (well, most of us)). However, it is great for the tribe because the children carry the tribe forward through time, increasing the tribe's survival. – Jonathan Fite Aug 14 '19 at 13:27
  • 3
    Not "to be nice" - that is not the IBM way. Not "to advance personal computing faster" - the original IBM PC was not a leading edge product. Pure and simple capitalism - provide the information necessary for 3rd parties to write software and build hardware compatible with the IBM PC and IBM sells more PCs. That was a big change from the mainframe era and it worked. – manassehkatz-Moving 2 Codidact Aug 14 '19 at 14:28
  • 1
I agree with manassehkatz that was IBM's reasoning. After all, a little PC could never, ever compete with their mainline Mainframe and Mini product lines!!! – kmarsh Aug 14 '19 at 19:48
  • 1
    @JonathanFite It's not necessarily altruism for altruism's sake. A rising tide lifts all boats. – forest Aug 15 '19 at 03:15
  • Altruism requires an altruistic motive, otherwise it's not altruism. And looking back on it and justifying the decision based on effects does not mean that's why they did it. But user66001's answer seemed more about the nature of altruism than this specific decision by IBM. – Jonathan Fite Aug 15 '19 at 12:32