17

It's an open question whether desktops would've kept using 5.25" until the end of the floppy era, but laptops meant something smaller was going to be introduced; that much was essentially predetermined. The contingent historical fact was the adoption of the particular 3.5" format we all remember, in preference to the many others that were contenders at the time.

I was reading this Wikipedia page just now and came across this:

"In the early 1980s, a number of manufacturers introduced smaller floppy drives and media in various formats. A consortium of 21 companies eventually settled on a 3 1⁄2-inch floppy disk (actually 90 mm wide) a.k.a. Micro diskette, Micro disk, or Micro floppy, similar to a Sony design but improved to support both single-sided and double-sided media, with formatted capacities generally of 360 KB and 720 KB respectively."

So the way Wikipedia puts it, it sounds like the decision was basically made by committee. Twenty-one companies got together, carried out a sober evaluation of all the contenders based on technical merit, manufacturing cost, which influential members already had a large investment in what, and so on, then issued a verdict, and so it was done.

My understanding had been a bit different. As I understood it (from e.g. here), the big breakthrough for the 90 mm format that ended up winning was getting into the Macintosh, for which Apple helped Sony debug the drives (Apple's own Twiggy drives, developed for the Lisa, never became reliable enough). I assumed this was the reason they started being used in PC-compatible laptops, which settled the matter.

If that version of the history is correct, the outcome was determined not so much by a grand deliberate decision from all interested parties, as by a few particular events, decisions made by a handful of individuals who were trying to solve their own short-term problems; a historical accident, chaos at work in the technical sense of the word.

Which version is accurate?

Kaz
rwallace
  • 15
    Speaking as a user of 8", then 5.25", and lastly 3.5" floppies, I'd suggest "comes in a rigid cover" was a significant factor. – dave Jan 18 '19 at 23:37
  • 4
    Yup, what a blast from the past @another-dave. I used to use 5.25" ones in school to load DOS and remember having to baby them on strict instructions from the teacher not to damage them. Then 3.5" rigids came into vogue and I would fling them across the schoolyard to give a class project to someone else – Karan Harsh Wardhan Jan 19 '19 at 06:34
  • 2
    @another-dave And the winner of floppy disks is ..... not floppy. What a twist of fate. – aktivb Jan 19 '19 at 12:02
  • 1
    Since I'm a pedantic programmer, I'd point out that (a) if you're judging a disk by its cover, it's not a "disk" either, and (b) "floppy" of course describes the disk, nothing else. But since I also like a low pun, I don't disavow the term "stiffy". – dave Jan 19 '19 at 16:48
  • 5
    A real LOL answer. Remember the days of pocket protectors and slide rules? Didn't think so. A 3.5" floppy will fit in a dress shirt breast pocket. It was a practical reason. Ever heard of sneakernet? – Guy Coder Jan 19 '19 at 16:56
  • 4
    I recall an interview in which Steve Jobs related the story of an engineer trying to sway him into using the 3.5". When Steve asked, "Give me a really good reason...", the engineer took it and placed it in his shirt pocket. Steve then said, "OK, go with that!". – jwzumwalt Jan 24 '19 at 01:45
  • 2
    @jwzumwalt That story has been told in a dozen variations with many different people mentioned. Also, the question is unique, as it asks why the 3.5" took over, while the other question assumes (falsely) that there was no development past 1440 KiB in diskettes. Different issues around the same media. – Raffzahn Jan 26 '19 at 04:09
  • @Raffzahn - The difference is I saw the interview and heard Steve say it. That does not mean it happened that way - it just means that was the "true" story as he related it, "straight from the horse's mouth". – jwzumwalt Jan 26 '19 at 17:23
  • 1
    @jwzumwalt I remember him. He was undoubtedly a good salesman. But even if true, it's only a story about Apple's use of Sony's format, only marginally related to the success of the 3.5" disk. Isn't it? – Raffzahn Jan 26 '19 at 18:12
  • 3
    I don't believe this question should be marked as duplicate. This is a question specific to the 3.5" disk drive. The other question is specific to the capacity, not the physical drive itself. – Jason Hutchinson Jan 28 '19 at 21:00

4 Answers

25

So the way Wikipedia puts it, sounds like the decision was basically made by committee.

And that's what it was - and what made it succeed. A standardized disk format with a drive interface compatible with existing controllers.

As I understood it, the big breakthrough for the 90 mm format that ended up winning, was getting into the Macintosh [...]

Not really. For one, Apple used a Sony drive that predates the standardization mentioned. While the mechanics and media were the same, the drive differed in its interface and operation, thus requiring dedicated controllers.

I assumed this was the reason they started being used in PC compatible laptops, which settled the matter.

More or less. There were eventually three major steps, plus some in-between developments marking this process, with IBM's use of 3.5 inch drives in their PS/2 line as the final milestone.

  • In 1980, Sony developed the 3.5 inch format. Only a few computers, like the HP-150 or Sony's SMC-70, used that drive.

  • In 1982, the 3.5 inch drive as we know it was defined by a joint committee. The approach was to keep Sony's mechanical and media design, but use an interface compatible (*1) with the existing Shugart standard for 8 and 5.25 inch drives. Only the connector was changed from a card edge to a pin header for size reduction. This had the advantage that all that was needed to operate a 3.5 inch drive on existing 5.25/8 inch controllers was a new cable.

  • 1983 brought the first drives to this standard, offering 360 KiB (single sided) or 720 KiB (double sided) when operated with standard MFM controllers. Besides many small machines, a first batch of drives for MSX computers opened a door into the consumer market.

  • The first PC(-ish) computer to use 3.5 inch drives was the Apricot PC in 1983.

  • 1983/84 was when Apple adopted a drive for their Mac that was based on the Sony design but incompatible with the standard. The deviation was meant to increase capacity and reliability at the same time. While it worked great, its impact on the floppy market was negligible, as Macs didn't gain much market share and the drive itself wasn't sold to other manufacturers.

  • 1985 saw Atari and Commodore adopting standard-compatible drives for their new 16-bit machines. Around the same time, 3.5 inch also established itself as the standard format for MSX computers in Japan and Europe (*2). In combination, these home machines created a huge user base, driving the cost of drives and media down to and below that of existing 5.25 inch drives.

  • 1987 saw IBM introduce their PS/2 line with 1440 KiB 3.5 inch drives (doubled capacity as HD) as standard. Even though PS/2 sales were, let's say, less than optimal, PC manufacturers rushed to embrace the 'new' format to show their advancement.

Shortly thereafter (1988 or 1989, depending on source) sales of 3.5 inch drives surpassed 5.25 sales ... and the rest is history.

  • Oh, and then there was ED (2880 KiB) in 1990, but that only caught on in Japan, despite IBM offering some PS/2 models with ED drives.
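Incidentally, the capacity steps in the timeline above (360, 720, 1440 and 2880 KiB) all fall straight out of the same MFM geometry of 80 cylinders and 512-byte sectors, with only the head count and sectors per track changing. A quick sketch of the arithmetic (the geometry values are the standard IBM-style formats, not anything specific to one drive maker):

```python
# Formatted floppy capacity = cylinders * heads * sectors/track * bytes/sector.

def capacity_kib(cylinders, heads, sectors, bytes_per_sector=512):
    """Formatted capacity in KiB for a given disk geometry."""
    return cylinders * heads * sectors * bytes_per_sector // 1024

# Standard 3.5 inch MFM formats: (cylinders, heads, sectors per track)
formats = {
    "SS/DD (1983)":      (80, 1, 9),   # single sided
    "DS/DD (1983)":      (80, 2, 9),
    "DS/HD (PS/2 1987)": (80, 2, 18),
    "DS/ED (1990)":      (80, 2, 36),
}

for name, (cyl, heads, spt) in formats.items():
    print(f"{name}: {capacity_kib(cyl, heads, spt)} KiB")
# -> 360, 720, 1440, 2880 KiB respectively
```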

*1 - Here hides the true secret: compatibility. It had already worked well in enabling the move from 8 to 5.25 inch. At the time the 3.5 inch was designed, many new drive variations between 2 and 4 inch were being developed (IBM's Demidrive being a good example). Most had their own 'improved' interfaces. None got wide distribution - except those using a Shugart-compatible interface. The 3 inch format is a good example of one that did.

*2 - MSX2 made the 3.5 inch drive standard. 5.25 were still supported, but all manufacturers switched to 3.5 for their new machines.

Raffzahn
  • I believe you are talking about the Atari ST and the Amiga. Those were 32 bit machines, using the M68000. Technically you can say they were "16/32" because most (but not all) of the registers were only 16-bits wide, but if you are going to simplify it to a single number, that number should be 32. – T.E.D. Jan 18 '19 at 17:52
  • 1
    @T.E.D. Oops? Well, I guess you should build a time machine and tell this to Motorola, as they advertised it as "[F]ully implemented 16-bit microprocessor with 32-bit registers" - for example in their 1983 datasheet titled MC68000 16-BIT MICROPROCESSOR. I think it's fine to go with what it was called at the time by its manufacturer. Don't you think so? – Raffzahn Jan 18 '19 at 18:32
  • 1
    Your time machine idea might actually be easier than getting all the Wikipedia edits calling it a 32-bit processor to take. But I'd still need it to fix a rather lot of books as well. – T.E.D. Jan 18 '19 at 18:43
  • 4
    @T.E.D. Well, you might be onto something. The internet is of course always right. BTW, I found a site explaining in detail why the earth is flat. I bet Wiki is following soon. :)) But seriously, one of the important parts when citing wiki is always double-checking the information found - and as shown, Motorola did provide a different view. So who's right, genuine documentation from 1983, or a Wiki entry made 30 years later by 'some people on the internet'? Besides, the time we fought to upscale our machines by attributing them as 32 bit or whatsoever is long past. Time to get serious again, isn't it? – Raffzahn Jan 18 '19 at 18:53
  • 1
    On a side note, when your machine is working, don't be so surprised that everyone back then called them 16 bit machines as well. It was an unpretentious, plain time ... at least it seems that way when looking back :)) After all, being 32 bit was one of the USPs for the 68020 in 1984 - which in turn was used by Motorola starting around 1985 to rebrand the 68000/010/012 as 16/32 bit and the 68008 as 8/32 bit (before that it was marketed as 8/16 bit). It's real fun to flip through old datasheets :)) – Raffzahn Jan 18 '19 at 19:01
  • 1
    Thing is, I used these machines, and programmed them. I know how they are classified. Heck, I've helped build a 68K emulator. But I'll tell you what, if you can manage to cure Wikipedia of its odd mass delusion with your 1 authoritative example from 1983, I'll quit gassing up my Delorean, and you will have saved this timeline. – T.E.D. Jan 18 '19 at 19:08
  • 1
    Honestly, rather than arguing about it, it would be probably more interesting to ask when the classification changed. I know of one other example calling them 16-bit CPUs back in 1987. But they are clearly today considered 32-bit architectures, and that's my memory from having worked with them as well. I am now kind of curious when that change happened. – T.E.D. Jan 18 '19 at 19:11
  • 1
    @T.E.D. Seriously, who says I'm fine with this time line? We should go ahead and create one where Commodore and Atari used the 8086 instead :)) And no, I'm not fighting any Wiki fights. Too many kids out there who think they know better and have way more time at hand. It's in fact one of the issues where the freak wars of the 80s are still being fought. And yes, I did program the 68k as well - maybe before you did - I still got my DTACK board (go check :)). And no, by all measures the 68000 is a 16 bit chip, as it got a 16 bit data bus and a 16 bit ALU - otherwise the Intel 8080 would be a 16 bit chip as well :)) – Raffzahn Jan 18 '19 at 19:26
  • 2
    Despite the committees and whatnot, there's no mention that the 3.5 design was simply, fundamentally, better: more durable, convenient, smaller, faster, higher capacity. Eventually it went to the 2.88M models that the NeXTstation (among others) supported, but by then it was too little, too late. – Will Hartung Jan 18 '19 at 19:28
  • 6
    Nobody back in the day called the 68000 a 32-bit processor. It was a 16-bit processor with a 24-bit address space, just as the 8080/6502 was an 8-bit processor with a 16-bit address space. The 68000 had 16 bit registers and could do native 16-bit math in one instruction, and that was that. And that was big at the time! – Harper - Reinstate Monica Jan 18 '19 at 22:58
  • 1
    NXP is nice to you, wants peace for all, and calls the M68000 today an 8/16/32-bit Microprocessor in their current manuals. – tofro Jan 19 '19 at 00:05
  • 1
    Sega went as far as to emblazon '16-bit' onto the case of their 68000 machine, if we're looking for citable sources that are easy to verify as to contemporaneous measures. – Tommy Jan 19 '19 at 00:31
  • 7
    @Harper -- no, even the lowly 68000 had 32-bit address and data registers, and I'm 99.999% sure it could do 32-bit addition and subtraction in a single opcode. I was a high school student with an Amiga in 1987, and nobody I knew would have EVER called the 68000 a "16-bit microprocessor" back then (8/16/32-bit, MAYBE... but psychologically, it was unambiguously 32-bit). – Bitbang3r Jan 19 '19 at 02:31
  • 1
    Just one more reference for this silly m68k side-talk (first of a series written by one of the guys at Motorola at the time): http://www.easy68k.com/paulrsm/doc/dpbm68k1.htm – Greg A. Woods Jan 19 '19 at 02:36
  • 1
    I believe NeXT machines had ED drives. Not that any average person could afford to buy a NeXT. – VGR Jan 19 '19 at 04:47
  • 2
    @Bitbang3r Mind to just check the original documentation? Also, if doing a certain sized operation with one opcode counts, then the Z80 and 8080 and 6800 are 16 bit CPUs, as they can likewise do 16 bit operations in a single opcode. And the address was 24 bit, not 32 - which did lead to compatibility issues when the 68012 and 68020 introduced larger addresses. I love silly talk :)) – Raffzahn Jan 19 '19 at 07:58
  • 1.44 MiB diskette drives were not standard on all IBM PS/2s (e.g. the Model 30). IBM also used a quad-density 2.88 MiB drive on later versions of their 3174 cluster controllers. – grahamj42 Jan 19 '19 at 09:43
  • 2
    @Raffzahn, I have long been of the opinion that the 68000 was a 32-bit processor with a 16-bit data bus and an instruction set based on 16-bit words (like the 8088 was a 16-bit processor with 8-bit data bus and byte-based instruction set). However, I dug out my copy of Motorola's M68000 Programmer's Reference Manual and find you are right that Motorola describes it as a 16/32 bit and not 32/16 (no doubt I was influenced by the name of the Fortune 32:16 system of 1982). – grahamj42 Jan 19 '19 at 10:10
  • 3
    The mechanical robustness of the 3.5" diskette was also important in displacing the 5.25". It also had a write-protect slider, whereas the 5.25" involved sticky tabs, and with the 8" floppies, you had to cut a slot in the jacket to write-protect them! – grahamj42 Jan 19 '19 at 10:16
  • @grahamj42 Cutting a hole to write protect a disk is still way saner than having to cut a hole to be able to write to it at all. – Mr Lister Jan 19 '19 at 12:25
  • @MrLister On 5.25" floppies, the notch was cut at the factory. The idea of write-protecting floppies came after the 8" design was standardised. – grahamj42 Jan 19 '19 at 12:33
  • @Raffzahn, I think you're talking about the practice of using the upper 8 bits of address registers to store unrelated values (the register HELD 32-bit values, but only USED the lower 24 bits for addressing). Look at it this way... everyone would call an Intel i7-7800X a "64-bit" CPU because its general-purpose registers are 64-bit (the same way a 68000's were 32), but it ALSO has assembly language instructions to aggregate them into 128, 256, or 512-bit super-registers and use them directly as such, the same way a 6502/6510 had limited 16-bit functionality despite being "8-bit". – Bitbang3r Jan 20 '19 at 01:54
  • 2
    @Bitbang3r Well, retroactively a lot can be guessed - fact is that Motorola did market it as a 16 bit CPU - and for all serious (and historic) purposes we should stick to that fact. Right? – Raffzahn Jan 20 '19 at 02:09
  • 1
    @Raffzahn -- After some consideration, I think I came up with something we can both agree with: the 68000 did most of its processing and i/o in 16-bit chunks... but presented PROGRAMMERS with the convenient, neat, and tidy facade of a virtual 32-bit processor that automatically did most of the tedious, ugly 16-bit gymnastics & grunt work for them. To programmers, the 68000 WAS a 32-bit processor, and programming it was a fundamentally different experience from programming a "naked" 16-bit CPU like the 8086 that made you jump through hoops to deal with 32-bit data and address > 64k chunks. – Bitbang3r Jan 20 '19 at 05:18
  • 1
    @Bitbang3r Well, of course one can't argue with an opinion-based PoV. From a historical point of view it's what Motorola called it, 16 bit. From a technological view it's 16 bit as well, as bus and ALU are 16 bit. Furthermore, any definition based on unrelated items (like other CPUs) presents weak logic, when a definition can be based solely on inherent features. Even doing so, there is a great and accepted example of a CPU with 32 bit registers (plain) but 24 bit addresses and 16 or 32 bit implementations: the /360. (BTW: I never had an issue with 8086 addressing - it's rather impressive for a clean 16 bit CPU) – Raffzahn Jan 20 '19 at 12:27
  • The 6502 has an 8-bit memory bus, 8-bit internal buses, and an 8-bit ALU; while it supports automatic carry propagation between the lower and upper halves of an address, it has no mechanism to retain the computed value after using it as an address. The Z80 has an 8-bit memory bus and 16-bit internal bus, but 4-bit ALU; it includes a mix of 8-bit and 16-bit instructions but is referred to as an 8-bit processor because many 16-bit instructions take more than twice as long as their 8-bit counterparts, and there are essentially no 4-bit instructions. – supercat Dec 11 '20 at 21:18
  • 2
    The 68000 can be fairly described as a 16-bit implementation of a 32-bit architecture. It does have a 32-bit internal bus, but as with the Z80 its ALU is smaller than its internal bus size (16 bits). Unlike the Z80, the 68000 has instructions to operate on ALU-sized quantities or even smaller ones, and operations on bus-sized quantities are somewhat slower than operations on ALU-sized quantities, but they generally take about half again as long as ALU-sized operations, rather than taking more than twice as long. – supercat Dec 11 '20 at 21:23
  • This rather ignores the rest of the story. Hitachi, Matsushita and Maxell designed a 3" drive and began pushing it hard. The others got together in part to fight it, and Sony won the format war. – Alan Cox Mar 07 '23 at 22:44
  • @AlanCox There have been many other attempts, like IBM's 4", Matsushita's 3", Mitsumi's 2.8", Sharp's 2.5" and Fujitsu's 2", or Sony/Canon's (unrelated) 2" Video Floppy. But the question is why the 3.5" won the way it did, not why the others lost out. For the 3.5" it was simply compatibility with existing systems (by changing Sony's custom interface to Shugart), offering low adoption cost and enabling slow adoption, thus growing supply gradually and lowering prices, thus making it even more appealing. All shown by the increased adoption. – Raffzahn Mar 07 '23 at 23:00
6

This is covered in one of the major Mac history works, although I can't recall specifically which one.

When Jobs was putting together his supplier list, the 3.5" had been standardized, as Raffzahn notes, but lots of companies were still pushing their own formats. Machines with all of these could be found on the market.

Jobs went to Japan to visit the various manufacturers and see where they were; I don't recall anything suggesting he had made up his mind on the format (other than "no 5.25"", anyway).

The account notes that in some cases he would be presented with mock-ups, and in one case with a block of material indicative of the size and shape of the proposed device. Apparently he savaged them in these situations, with the book joking that they went away to commit hara-kiri after these meetings.

Only Sony had an actual production-quality drive ready to go at the production numbers he demanded. His numbers proved overly optimistic, but the rest is history.

It seems the history is similar to USB in many ways. USB was going to happen sooner or later, but the iMac certainly helped jump-start the process.

Maury Markowitz
  • 1
    This folklore article covers the topic: https://www.folklore.org/StoryView.py?project=Macintosh&story=Hide_Under_This_Desk.txt – N.D.C. Jan 18 '19 at 18:47
0

The 3.5" floppy drive was first introduced to the market in 1983, as a single-sided version with a 360K capacity. The following year, double-sided disks were introduced that doubled the capacity to 720K. Eventually the capacity was increased to 1.44M, and there was even a short-lived 2.88M version. Even though these drives were available, it took a long time for them to be adopted as the standard.

The 5.25" floppy, which displaced the earlier 8" drives, became the de facto standard for over a decade. Most major software companies such as Microsoft were shipping their software on 5.25" floppies long after 3.5" drives were available. Most IBMs and IBM compatibles did not have a 3.5" drive until the early 1990s, and at that time, companies wanted to maintain backward compatibility with existing hardware. Since most software was still on 5.25", there was not much of a need to have both drives, since a floppy drive at that time was an expensive option. Companies began the shift to the 3.5" floppy around 1990. The 5.25" drives were eventually phased out by the mid 90s, around the time the Pentium was released.

One of the first 3.5" floppies I owned was the game Rampage, which actually shipped on both a 5.25" floppy and a 3.5" floppy. In fact, MS-DOS 5.0, which was released in 1991, shipped on a 5.25" floppy!

Sure, Apple had a 3.5" floppy many years before. But that was not what really forced the change. Apple floppies had their own proprietary disk format, which was not compatible with a PC. Apple also had a very small market segment compared to the PC. What really helped drive it was that the form factor of the PC changed. Original IBMs had a full-size AT board, which was massive. In 1995, the ATX form factor was released, which was significantly smaller. By this time the 5.25" drive had been abandoned for the 3.5" drives because they took up significantly less space. The CD-ROM was also becoming a popular option, and was installed in the drive bay originally designed for the 5.25" floppy. The 5.25" floppy had already been in decline for a few years, but the CD-ROM effectively killed it.

0

I'm going to answer the question from a practical user perspective, having been an active microcomputer user through this transition.

5.25" floppies have a few disadvantages:

  • Limited capacity. Yes, higher-density versions of these disks could store a megabyte or more, but they were expensive and rare. 3.5" floppies stored 320-400K on SSDD media, all the way up to 1.44 MB on DSHD media (and 2.88 MB on the never-very-popular extra-high-density version, which I never used).
  • Fragility. 5.25" floppies were easy to damage because they were flexible. 3.5" floppies had a hard shell so you could throw a disk into a pocket or even mail a disk in an envelope without any great danger.
  • Jackets. Those diskette jackets are a nuisance. You can lose them and misplace them, and have to do something with them while your disk is in the drive. 3.5" disks have a built-in retractable metal shutter that provides superior protection plus the advantage of being able to be forgotten about.
  • Write-protection notches. 5.25" floppies have a notch that has to be covered by an adhesive strip to protect the media from being written to. 3.5" floppies have a plastic switch that can be toggled by the user very easily (no running out of strips) and can be untoggled just as easily (the strip could be removed from a 5.25" disk, but was rarely sticky enough to be reused afterward).

I think the other reason 3.5" disks took off so strongly is their size. Computers like the Macintosh, the Atari ST series and the Amiga 1000 and 500 had built-in floppy drives and yet were fairly small machines. A 3.5" floppy drive stored more and took up less physical space. This was less important on desktop PCs, but even there it helped, as cases could get smaller or the space could be used for other devices.

Jim MacKenzie
  • 1
    Capacity is not related to media size ... if at all, it's the other way around with 5.25 being able to store (size relative) more on the same media type. – Raffzahn Jan 25 '19 at 20:45
  • 1
    @Raffzahn Of course, but I'm talking about the practical capacity, not the theoretical capacity. Standard densities of 3.5" disks store more than common densities of 5.25", despite being smaller-sized. – Jim MacKenzie Jan 25 '19 at 21:09
  • 1
    Just because the corresponding 5.25s never made big inroads. Well, except maybe Bernoullis, with 10 MiB (and later 20 MiB) at a time when 3.5's were stuck at the same 720 as 5.25 :)) Yes, Bernoullis are floppy disks. – Raffzahn Jan 25 '19 at 21:13
  • @Raffzahn Users don't care about what made inroads. They care about what is available to them. DSDD 3.5" stores more than double what DSDD 5.25" stores. DSHD 3.5" stores about 1/3 more than DSHD 5.25". Bernoullis were never mainstream. Of course bigger media can theoretically store more than smaller media - just use the same density - but that was never done for various reasons, and users (except for you and fans of tech) don't care why. Harsh to downvote me for this. – Jim MacKenzie Jan 25 '19 at 21:17
  • This doesn't seem to answer the question. Many of the other 2-4" formats had similar advantages over 5.25" as the current 3.5" format, did they not? – cjs Apr 14 '20 at 08:35
  • @cjs Probably true, but none had the advantage of ever having been deployed on a system that actually had significant market penetration. I'm not sure what the first popular system supporting 3.5" disks was - possibly Macintosh, which didn't have any need to tie itself to 5.25" disks - but once one popular system supported 3.5", then the availability and hardware cost problems start to go away and the practicality arguments start to take over. DOS machines had no need to migrate, but the drives and disks were available and affordable and now their particular merits (see my post above) emerged. – Jim MacKenzie May 02 '20 at 16:48
  • I'm not sure I buy your explanation. For example, mentioning the Mac without mentioning the FM77 (released the same year) does seem to indicate a bit of an, "if it's not a US computer, it doesn't count" bias. Would you care to list the top ten selling computers using 3.5" drives at the end of 1983, worldwide (and not just in the US and perhaps the UK)? – cjs May 02 '20 at 20:51
  • You forgot to mention another feature of 3.5" disks: they were sized to fit commonplace men's shirt pockets, at least in the US. – supercat Dec 11 '20 at 21:24