
Compiling takes computing power, and to a lesser extent, storage and memory. Back in the 70s and 80s, personal computers weren't powerful enough to compile code in high-level languages, or could do so only very slowly. Thus small, simple compilers were often more welcome than sophisticated ones generating optimized binaries. Some companies had to install minicomputers for their programmers instead of compiling on the target machines, and bootstrapping has always been an achievement for a new language and its compilers, such as Small-C, which was first refined on Unix, then brought step by step toward bootstrapping.

Employees and students sent their cards and tapes to batch-processing mainframes to compile. People have cross-compiled since the 50s. Were there any cross-compiling services for the public instead of internal users, free or subscribed, that one could connect to via teletype, telex or modem, or that received letters and packages, so that code written, checked and interpreted locally could be compiled and debugged on their much more powerful machines?

Glorfindel
Schezuk
  • I don't know that the premise of this question is truly valid; I remember several business systems for the Commodore PET/CBM machines written in 6502 assembler and CBM-BASIC; I also remember business systems written for the Apple II (pre-plus) in Apple Pascal/UCSD Pascal. Turbo Pascal was an early useful compiler for CP/M systems, and was also one of the early ports to the IBM-PC, and Microsoft had an assembler and compilers that ran on and for the PC for Pascal, FORTRAN, and COBOL, with C coming later. – Jeff Zeitlin Feb 15 '23 at 11:48
  • It's before my time, but I have the impression that with a reasonable amount of money and/or begging, one could get a dial-in shell account on some organization's minicomputer. You could then run your cross-assembler or cross-compiler or whatever you want. If they didn't already have it installed, you would ask the sysadmin politely to install it, or build it from source in your own account if you had enough disk quota. – Nate Eldredge Feb 15 '23 at 16:08
  • @JeffZeitlin: I think there is a somewhat earlier time period where it could have made sense, e.g. mid- to late 70s. A hobbyist might only be able to afford a small microprocessor and a couple KB of ROM, enough to control their toaster or whatever but not enough for a development environment. Though on the other hand, if you only have a couple KB of ROM, it's probably not too much work to just write the code on paper and assemble to machine code by hand. – Nate Eldredge Feb 15 '23 at 16:12
  • @NateEldredge - I'm not sure, even as late as the Apple II, how widespread dial-up services were, and I'm not convinced that "cross-compiling" was a Thing at that point - although I vaguely seem to remember that cross-assemblers weren't unknown (but not widespread). – Jeff Zeitlin Feb 15 '23 at 16:28
  • @JeffZeitlin: Cross assemblers were the original means via which code was developed for the 6502. While it might have been possible to bootstrap a 6502 monitor using a PROM, a bunch of switches, and a pulse generator, and then use that to bootstrap an assembler, programs like the Microsoft BASIC interpreters used in 6502-based machines by Commodore and Apple were assembled on a minicomputer. – supercat Feb 15 '23 at 17:02
  • The question talks about "small simple compilers" vs "sophisticated ones generating optimized binaries". It should be pointed out that the "sophisticated ones" back then were not at all in the same ballpark as today's optimizing compilers - not even for mainframes or larger minis. "Optimizing" technology was in its infancy. Common subexpression elimination, loop reduction/unrolling, stuff that today is totally basic. No link-time code generation! No profile-guided optimizations! And the optimizations there were, were generally for large numerical codes. Nothing you'd be doing on a micro. – davidbak Feb 15 '23 at 23:24
  • Anyone that had a real need for this, which was probably mostly games developers, used something like Andy Glaister's PDS in-house. https://retro-hardware.com/2019/05/29/programmers-development-system-pds-by-andy-glaister/ – Alan B Feb 16 '23 at 13:45
  • RE: "Back in the 70s and 80s personal computers weren't powerful enough to compile codes in high-level languages" Nope, not what happened. The real "problem" was that they were not considered appropriate targets for some languages (specifically COBOL). In fact, though, there were compilers on PCs for virtually every other language (and yeah, there were probably even some COBOL compilers out there too). They just didn't sell very well. – RBarryYoung Feb 16 '23 at 19:54
  • @RBarryYoung - We've talked about Realia COBOL before on this site; that was pretty successful in its target market. – davidbak Feb 16 '23 at 23:11
  • @RBarryYoung I heard that some compilers (not assemblers) just wouldn't fit in the memory or even a single floppy disk of home computers, and compiling a complex project on personal computers took about an hour if not hours. – Schezuk Feb 17 '23 at 01:08
  • @Schezuk - oh, well, that was true. The Microsoft C compiler has a pass one and pass two - each on its own floppy. If you only had two drives one held a diskette with your files and the other you'd swap the compiler disks in and out .. and in and out .. and in and out .. and so on and so forth. That is one major reason why Turbo Pascal (and later Turbo C) were so damn popular! – davidbak Feb 17 '23 at 01:50
  • I'll point you to https://xkcd.com/303; that was a real thing. Sometimes, often, you just had to wait. – Kevin McKenzie Feb 23 '23 at 17:29
  • @KevinMcKenzie - that comment isn't on topic for retrocomputing: it is still the case, on some projects ... Sigh. (Just kidding of course about the comment not being "on topic" - anything goes in comments until the moderators come around ...) – davidbak Feb 23 '23 at 17:31
  • @davidbak - I suspect that MS C being a two-pass compiler was ultimately inherited from their earlier Pascal compiler, which was also a two-pass compiler. The first pass converted source code into an intermediate code format; the second pass converted the intermediate into Intel OBJ format, and then you had to run the linker to generate an EXE file. The second pass was identical for MS Pascal, MS FORTRAN, and MS C; I remember one hacker of my acquaintance who actually used only one "pass 2" executable for all three languages, saving himself a not-insignificant amount of disk space. – Jeff Zeitlin Feb 28 '23 at 12:28

5 Answers


I too doubt the premise of the question, on several counts:

The notion of using a teletype for access to remote compiling services seems ill-advised, since punching the object module on paper tape (the only option on a teletype) would take hours.

Assuming you'd actually use the target computer to receive the object code (thus writing it to disk, not paper tape), the economics would seem to be against you. You'd need transmit time plus wait time (for your slot) plus compilation time plus receive time to be significantly less than local compilation time.

Bureau services were also not cheap. You could likely buy a few more fast micros for the annual bureau fees.

So my answer to the posed question is "no".

With respect to "Employers and students sent their cards and tapes to batch-processing mainframes to compile": as a student - no, we didn't. We sent our programs to be run. The fact that they had to be compiled before running was incidental. The nature of student programming tended to be 'after one successful run we are done with it', so keeping object code, even if possible, had little point.

dave
  • Loading object modules from tape was really a thing on very early micros. But then again, not exactly what someone did who had remote access to a large(r) machine. That was usually based on a way more integrated process. Besides that, yes - spot on. – Raffzahn Feb 15 '23 at 12:36
  • +1 for: "Bureau services were also not cheap." Such as there were, they were aimed directly at enterprises. Not hobbyists, not students. If for no other reason than that they would have had no way to invoice you, credit cards (back then) being not nearly the universal currency facilitator they are now. – davidbak Feb 15 '23 at 17:02
  • "Home computers weren't powerful enough" is, quite bluntly, rubbish /particularly/ by the early 1980s when discs were beginning to be affordable. By the end of the 80s you could buy a '386 and use e.g. Ada with a DOS extender. Earlier than the 80s... well, there's plenty of accounts of people ferrying card or tape to a mainframe for a nighttime run, and plenty of accounts of individuals or small businesses hiring minis: but the state of MODEM comms precluded submitting source and having the binary sent back to you. – Mark Morgan Lloyd Feb 17 '23 at 21:00
  • "punching the object module on paper tape (the only option on a teletype) would take hours" No, if you needed large quantities of output in whatever format (green bar printouts, tape, disk pack, cards, microfilm), the bureau would send it via courier, and you'd get it within hours, if so desired. Similarly, large inputs would be picked up. Often you'd have an overnight cycle, whatever was done by the end of the day was sent in, and delivered the next morning, basically the same as the "nightly" builds before continuous integration. – user71659 Feb 17 '23 at 21:25
  • Well, ok, but that's scarcely 'using a teletype'; my own access to an IBM 7094 was via the UK postal service. And I think it unlikely that the target micro, the one that is too underpowered to run the compiler (the premise of the question), will have a card reader, proper magtape drive, or removable disk pack drives. – dave Feb 17 '23 at 23:13
  • @MarkMorganLloyd - I think you're disagreeing with the premise of the question, not my answer? – dave Feb 17 '23 at 23:15
  • @MarkMorganLloyd - I recognize that compiler! Actually, you only needed a '286 for Alsys Ada. And a full-size ISA slot (in a case with sufficient room) for the included 4MB memory card ... Oh! And to make this relevant to the OP's question: The Alsys Ada x86 compiler was developed on '286 machines (Zenith PC-AT clones, to be precise. With that 4MB memory card and a 40MB hard disk.) Although the base compiler that we ported to the x86 was developed (native) on a Bull minicomputer of which I've forgotten the model number. But it was never cross-compiled or remote-hosted. – davidbak Feb 17 '23 at 23:48
  • @another-dave Yes, sorry, I was trying to reinforce your answer. – Mark Morgan Lloyd Feb 18 '23 at 07:58
  • @davidbak Ah yes, Alsys. Weren't they the people who obtained a Winding-up Order for GEC due to an unpaid invoice? Never /ever/ ship something to a corporate simply because somebody says "I'd like one of those" without giving you an order number. – Mark Morgan Lloyd Feb 18 '23 at 08:02

TL;DR: Yes, such services existed. But usage was quite limited in time and audience. Those that existed may rightfully be considered exotic fringe cases for very special situations.


"Back in the 70s and 80s"

Now, that covers too wide a range of applications and use cases for a coherent answer.

"personal computers weren't powerful enough to compile codes in high-level languages or if capable took huge amount of time."

Is that so? A 1 MHz 6502 can assemble its own BASIC in a few minutes, so nothing that can't be waited for. Once there were personal computers, compilation happened there - after all, the main advantage of personal computers was their immediate availability.

"Some companies had to install minis for their programmers instead of compiling on the targeted machines,"

This was rather due to two factors:

  • Unified development environment, and more often than not
  • Target machines that had no OS or programming environment.

"Employers and students sent their cards and tapes to Batch processing mainframes to compile."

Not really. Students usually did jobs on their institute's machines, not on some different one. Likewise, at the time real cards were still a thing in programming; there were no micros and not much cross development.

Usually the only time programming happened with cross-compilers on large remote systems was when new CPUs were introduced and the manufacturer did not have any development system already at hand.

So Intel, for example, offered cross assemblers for the 4004 and 8008 before their ISIS systems became available. After that, development was intended to happen there. Likewise, when new CPUs like the 8048 or 8086 became available, Intel offered software packages to compile for these new models - often in combination with hardware adaptors, programmers and ICE probes.

Zilog in contrast just used Intel systems in the beginning before their own development systems were ready - the Z80 is just an extended 8080, isn't it? :))

Apart from that, cross compiling was only a thing for game systems, as these did not really provide a way to develop on target. An exception might be development for similar systems, like developing on a PET for VIC20.

Multi-platform game development was essentially the only area where cross-compilation was a big issue - and here it was always an internal one. Not just to secure the game from leaking, but, as mentioned, to provide a single development environment for all platforms.

"Were there any cross-compiling services for the public instead of internal users, free or subscribed,"

All of that does of course rely on your definition of "the public". In any case these services were rather limited and only viable for a short time. A good example might be the (cross) assembler MOS offered in the beginning to their customers, using a GE-based dial-in service. As so often when it's about 6502 material, Hans Otten has the documentation, here the 650X Cross Assembler Manual.

Notably, and contrary to what the question assumes, that assembler was not comfortable at all. It was a very primitive linear beast. This makes sense, considering that the step from "no assembler" to "some assembler" helps much more than the step from "some assembler" to "comfortable assembler".

The dial-in solution was soon superseded by MOS' Resident Assembler developed for the MDT650 - the history of MOS assemblers can be followed on Michael Steil's Pagetable.

Toby Speight
Raffzahn
  • It's interesting about the cross-assembler dial-up service; I would never have supposed it to exist. Was the rationale more that the target system had no persistent storage for source code? – dave Feb 15 '23 at 13:03
  • "Students usually did jobs on their institute's machines, not for some different one" That depends. For university students, yes. For me as a high-school student, the 7094 was at a remote university. (Applying the word 'student' to high school might be an Americanism; in England we were 'pupils'). – dave Feb 15 '23 at 13:11
  • I find it somewhat interesting in retrospect that while some few computers were equipped to start and stop two cassette drives independently, this ability does not seem to have been substantially exploited in software development. If one were to build a cable to interface a second cassette drive to the VIC-20's user port, it would have been possible to design a Pascal compiler cartridge for the VIC-20 which could read a sequence of source files of arbitrary size from one datasette and produce on the other a tape that could be loaded and executed in a single pass, provided that... – supercat Feb 15 '23 at 15:59
  • ...the total size of the symbol data was small enough to fit in the VIC-20's 5K of RAM. If one had a number of short tapes, each with part of a source program, one could load the output tape drive with a blank tape, feed in all the source tapes one by one, and end up with a single executable tape, without needing a disk drive nor sufficient memory to hold everything at once. – supercat Feb 15 '23 at 16:03
  • @another-dave It's still 'their machine', not some other. The rationale for offering a remote cross assembler isn't missing storage, but missing systems. It was an offer when the CPU got launched. What else to use if there are just some chips to be sold? – Raffzahn Feb 15 '23 at 17:22
  • A 1 MHz 6502 can assemble its own BASIC in a few seconds. If yours can't, then you need a better assembler. – d3jones Feb 16 '23 at 00:07
  • @d3jones Lucky for you - then again, experience tells that using a better assembler makes assembling slower, not faster, as it does a bit more than just replacing mnemonics with opcodes. Glad to be shown otherwise (been waiting for 40 years...) – Raffzahn Feb 16 '23 at 00:12
  • I'm genuinely confused, just as d3jones. You said "A 1 MHz 6502 can assemble its own BASIC in a few minutes"... isn't it seconds? I reckon my misunderstanding is this: you type some lines of BASIC, then you type RUN. Isn't "assemble" what happens after that? In my 6502 and Z80 I remember typing some huge (for my teenager standards) BASIC programs and they took very small time to run. Maybe I'm misunderstanding what "assemble" means. – Gerardo Furtado Feb 16 '23 at 11:35
  • @GerardoFurtado Not really. That's tokenizing and interpreting. "Assemble its own BASIC" means building the BASIC interpreter, like the 12 KiB of Applesoft, from its assembler sources. – Raffzahn Feb 16 '23 at 12:16
  • @Raffzahn What assembler did you use? LADS for the Commodore 64 was quite inefficient, and self-assembled from RAM to RAM in around 2 minutes. Back in 1987 I wrote my own assembler, about 6K of object code from 19K of source code, and it self-assembled in 12.5 seconds. Using hash tables, even on an 8-bit micro, makes a big difference! – d3jones Feb 17 '23 at 01:25
  • @d3jones MS' 8Ki BASIC (so less than C= BASIC) is 7k lines in 160 KiB of source - taking your numbers that adds up to >1 min. More to the point, what were the capabilities of your assembler? 12 seconds for a 20k source leaves less than 200 instructions per byte. Sounds like a rather simple mnemonic-to-opcode translator, not a serious assembler. Did it support procedural macros, structures, linkage formats, etc.? MS BASIC is a very simple source due to its age. Anything newer will be way more complex. Last but not least: "a few minutes" is a statement of high speed, as remote operation will take more time. – Raffzahn Feb 17 '23 at 02:26
  • @d3jones For a serious 6502 assembler, take for example a look at ORCA/M and its capabilities. Assemblers like that are what is needed for professional projects. – Raffzahn Feb 17 '23 at 02:32
  • @d3jones Aren't we talking about HLL compilers? I don't think Macro assemblers can be as slow or big as the question asks even on something like a CP/M PC or a Commodore PET. – Schezuk Feb 17 '23 at 09:27

The main ambiguity here for me is "the public".

I have a microprocessor course book of 1978 (A.J. Dirksen's "Microprocessors"). A part of the book presents some ways of developing:

  • Time sharing
  • Using an in-house system (from PDP-11 to IBM 370)
  • Using a development system

Rodnay Zaks (in "Programming the Z80", possibly also in "From Chips to Systems") adds the single-board computer and the home computer (but he dismisses the software support of home computers).

Both authors speak about rented terminals, but neither of them talks about the data communication cost. However, just the fact that it is mentioned in both a US book and a European book is proof enough that such services did exist.

But the rent of time and a terminal was rather high. Dirksen talks about 100 NLG to 500 NLG per hour, which translates into current prices of 150 EUR to 750 EUR. So, de facto, not really accessible to "the public", only to companies which could justify the costs.

chthon
  • In some areas in the US, unlimited local calling was available. If someone managed to get a shell account, communication costs sometimes might not have been a concern. – Schezuk Feb 16 '23 at 00:26
  • If dialed up to a bureau, you can bet you're paying for connect time, since you're tying up real equipment: a modem port in a finite number of modems. – dave Feb 16 '23 at 05:52
  • @Schezuk there's telephone time, and then there's machine time. Even with unlimited local calling, it cost a lot to rent time on the remote machine. – RonJohn Feb 16 '23 at 18:13
  • Another aspect was that anyone who called you would get a "busy signal." Voice mail did not exist. Even when answering machines were invented, they could only pick up and take a message if the line were not already in use. Neither could the modem disconnect and let the call through. In practice, if you were going to use a modem at a time of day when people would be trying to call you, you needed to pay the phone company for a second line. – Davislor Feb 17 '23 at 18:37

To give you some idea of how things worked back then: In 1973, a kid who lived not far from where I do now sent a letter to a ’zine called The People’s Computer Company, which they printed on page 5 of their November issue, with a handwritten note, “Somebody help him out!”

Dear Sir(s),

I have recently moved from Corvallis, Oregon to Bellevue, Washington. In Corvallis I had access to a CDC 3300 and a Digital PDP 12. A friend of mine gave me some old copies of your newspaper.

I have not had any luck in finding a computer to use. So I would greatly appreciate it if you could send me a list containing the names & addresses (and possibly more information) of your subscribers in the Seattle, Bellevue area, to aid me in my “search”, Any and all efforts will be appreciated!

Thank you!

A Friendly Computer Freak,
Stuart A. Celarier
Age 13, Grade 8

And, in 1973, they published his full home address. Corvallis is a college town, the home of Oregon State University, which is presumably where he got access to computers.

Davislor
  • Nice, but just where is the relation to cross-compiling? AFAICS it's about access to any computer without specifying a use case, isn't it? (BTW, interesting find - had to read every page :)) – Raffzahn Feb 15 '23 at 23:12
  • @Raffzahn That's all Stuart Celarier said about it in his letter (although he still lives nearby, so I suppose I could try to get in touch and ask). So I don't know if he ever used a cross-compiler when he was twelve years old and borrowing time on two mainframes in Corvallis, Oregon. However, the PDP-12 did provide "an ultra-powerful, general-purpose assembler, editor, and monitor for both PDP-8 and LINC programming," including a Fortran compiler. So he might have had access to that. – Davislor Feb 15 '23 at 23:27
  • Hmm. Many did not have any computer access (might be 99.999% of the world's population at the time) and some were looking to get it. Local or remote. But I wouldn't see how doing so is related. – Raffzahn Feb 15 '23 at 23:51
  • @Raffzahn Respectfully, if it was that difficult to get access to a computer at all, I think that implies something about whether there were public cross-compilation services. The closest thing to that would be university time-sharing systems open to their students. – Davislor Feb 16 '23 at 00:01
  • @Raffzahn remote access is a sine qua non of remote cross-compilation. – RonJohn Feb 16 '23 at 18:16
  • @RonJohn I assume you mean 'condicio sine qua non', but in any case, such remote access alone will not produce cross compilation as a mandatory result. All it gives is access to a remote system - that is, unless you can prove that all remote systems have always - or at least in a great majority of cases - been used for cross compilation. If not, it's not a relevant item. (Not to mention that most cross compilation setup was local, not remote). – Raffzahn Feb 16 '23 at 18:30
  • @Raffzahn no, I definitely meant sine qua non. You've got to have remote access before running a cross compiler on a remote system. – RonJohn Feb 16 '23 at 21:10
  • [Originally in Latin:] 'Sine qua non' is not the complete quote, though. Remote access may be required, but does not always lead to compilation. [My Latin may be old, but it's still there] – Raffzahn Feb 16 '23 at 21:30
  • @Raffzahn Stop pinging me constantly with these obstreperous, irrelevant comments. Thank you. – Davislor Feb 16 '23 at 21:34
  • @Davislor not 'pinging' you in any way, so I guess I need to be sorry for the way the system is made? Also, it's exactly about the core point of your 'answer'. – Raffzahn Feb 16 '23 at 21:52

Depending on your definition of "public", Motorola provided development systems on their timesharing facility for companies who'd taken their training course on the 6800 family. This ad is from Electronics, April 1976:

Motorola Training ad

(Image source: Motorola M6800 Training ad April 1976 - File:Motorola M6800 Training ad April 1976.jpg - Wikimedia Commons)

scruss