20

How were coin-op machines of the '70s and '80s programmed? What tools did a programmer use?

Nowadays, a programmer can use a PC with an IDE to write code, test it, and set breakpoints. But in the '80s, how could a programmer test the graphics, the algorithms, and so on?

TonyM
bassaidai0
  • 4
    Do you mean arcade games, such as Space Invaders, Asteroids, Pac Man, etc.? – Fred Larson Feb 28 '20 at 17:20
  • 2
    Yes, that's exactly what I mean. – bassaidai0 Feb 28 '20 at 17:26
  • I imagine the answer is much the same as for any small essentially headless system, say a network router, protocol gateway, or similar. We did our programming on a timesharing system. With luck it had a related OS and similar hardware, so you could do some testing in comfort. Then you loaded your code into the target hardware (network, serial cable, floppy disk) and tested it. There was some facility (serial port, network virtual terminal, remote debugger) to interact with the code. – dave Feb 28 '20 at 19:26
  • 15
    In reality, not so different from the way mainframe software was written and debugged in the 1970s. The most efficient way to find bugs was to read the source code. Since you probably wrote the source code with a pencil on paper, you had more thinking time than typing the first idea that came into your head into an IDE. If the program crashed, you got a printed memory dump - several hundred pages of hexadecimal numbers. You learned how to find the information you needed from that, and nothing else. With two or three test runs per 24 hours, there was time to *think,* not experiment! – alephzero Feb 28 '20 at 19:27
  • 11
    +1 for the idea that we reviewed our own code. Personally, after I felt like the code was all done and tested, then I'd take it home on green-stripe lineprinter paper, spread it out on the floor after dinner, and thoroughly read it. Inevitably there were improvements that could be made. – dave Feb 28 '20 at 19:30
  • 5
    I want to make an embittered comment about emacs and 80-column source code limits. But I suspect it wouldn't contribute much. – Tommy Feb 28 '20 at 22:09
  • 1
    @alephzero: ...and we liked it that way! We *LOVED IT*!!! :-) (Full disclosure: my first IDE was SPF :-) – Bob Jarvis - Слава Україні Feb 29 '20 at 17:22
  • Nothing beats a printout for programming. Way more versatile than any number of windows and overlays on a screen. – Raffzahn Feb 29 '20 at 17:35
  • My dad does research in Physics, I've seen some computer labs in the late 70s, both in France and in the US. It looked really magical for a kid to see these perforated cards, magnetic tape spools and monochrome terminals. I remember that they had a dedicated room with the card reading machines, and a dedicated printer room! Today I realize how low the productivity must have been dealing with that stuff :) – Thomas Feb 29 '20 at 19:27
  • 1
    @alephzero and others - "desk check" - a skill so lost even the name for it is gone – davidbak Mar 01 '20 at 19:58
  • As far as you know, are there any videos showing the arcade developers of the '70s/'80s at work? Thanks – Giorgio Brugnone Jan 09 '21 at 17:12
  • You can look at MAME to see what hardware was available for the different machines. – Thorbjørn Ravn Andersen Aug 31 '22 at 15:57

4 Answers

21

The assembly (or, more rarely, compilation) was generally done on a minicomputer, such as a VAX-11. The tools were often written in-house. They might have some sort of simulation software to help test some of the code, but in the end you'd use a PROM burner to burn your code onto EPROMs (EEPROMs were not widely available in the '80s) and plug them into a development board to see how well it worked.

Jed Margolin, a hardware engineer for Atari in the '80s, did a lot of work on several of the classic arcade games from that period. A while back he published his email archive from that period. Reading it will probably give you, as it did me, a lot of insight into the coin-op development process.

cjs
  • 3
    That email archive is really interesting! And very nostalgic for me to see VMS MAIL-style headers. – LAK Feb 28 '20 at 18:00
  • 7
    Not to mention the specific intro page at http://www.jmargolin.com/vmail/vmail.htm — "When I arrived in 1979, software for the games was cross-assembled on two DEC PDP-11/20 ... We had two computer operators who would take your marked-up listing, do the edits, and run the program. If it actually ran ... it would produce a listing and a paper tape. ... You then took the Paper Tape to your emulator ... Programmers could load the program from paper tape, run it, set breakpoints, and examine memory as well as write to it. It was all done in Hex code ..." – Tommy Feb 28 '20 at 23:20
  • VAX-11 for an arcade game? What are you, Atari?! – Maury Markowitz Aug 31 '22 at 11:10
  • Microsoft built almost all their early software on a PDP-10; they even had an 8080 emulator running on the PDP-10 for debugging. – Eric Brown Sep 02 '22 at 00:16
15

Of course, there were hundreds of different standup arcade games, utilizing different hardware, and different developers creating software for them using different tools. So, there's no "one-size-fits-all" answer to your question. Rather, there were some general-purpose low-level ways of doing development back then that were fairly universal.

First, early arcade games were programmed using machine language. The only "offline" tool that might have been used to free the developer from encoding opcodes by hand would be a simple assembler. It was not unusual for early processor vendors to provide a cross-assembler for their processor, and that could run on common mini or micro computers of the time. Therefore, editing of the initial assembler source, and automatic conversion to machine code, could be done on a more advanced computer than the processor used in the arcade game.
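
For a feel of what such a "simple assembler" actually did, here is a small, hypothetical C sketch of the core idea: look each mnemonic up in an opcode table and emit bytes. The opcodes are genuine 6502 encodings, but the table layout, the toy "source" format, and everything else are invented for illustration - this is not any vendor's actual cross-assembler.

```c
/* A heavily simplified, hypothetical sketch of what a "simple assembler"
 * boils down to: look each mnemonic up in an opcode table and emit bytes.
 * It only knows a handful of 6502 instructions, one addressing mode each. */
#include <stdio.h>
#include <string.h>

struct op { const char *mnemonic; unsigned char opcode; int operand_bytes; };

/* The opcodes are the real 6502 encodings for these particular forms. */
static const struct op table[] = {
    { "LDA#", 0xA9, 1 },   /* LDA immediate  */
    { "STA",  0x8D, 2 },   /* STA absolute   */
    { "JMP",  0x4C, 2 },   /* JMP absolute   */
    { "BRK",  0x00, 0 },   /* software break */
};

int main(void)
{
    /* Toy "source": a mnemonic key plus an operand value, if any. */
    struct { const char *mn; unsigned operand; } source[] = {
        { "LDA#", 0x3F   },   /* LDA #$3F  */
        { "STA",  0x2000 },   /* STA $2000 */
        { "JMP",  0xF000 },   /* JMP $F000 */
        { "BRK",  0      },
    };

    for (size_t i = 0; i < sizeof source / sizeof source[0]; i++)
        for (size_t j = 0; j < sizeof table / sizeof table[0]; j++)
            if (strcmp(source[i].mn, table[j].mnemonic) == 0) {
                printf("%02X ", table[j].opcode);
                /* Operands go out low byte first: the 6502 is little-endian. */
                if (table[j].operand_bytes >= 1)
                    printf("%02X ", source[i].operand & 0xFF);
                if (table[j].operand_bytes == 2)
                    printf("%02X ", (source[i].operand >> 8) & 0xFF);
                break;
            }

    putchar('\n');   /* prints: A9 3F 8D 00 20 4C 00 F0 00 */
    return 0;
}
```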

Second, the developer needed to be able to test their code. In some cases, they may have done that in a simulator while they were still working to finalize the custom hardware that would go into the arcade game. More likely, they tested their code on a prototype of the arcade game hardware. This meant that hardware and software were being developed and debugged simultaneously, which is hard. Thus, it was common practice for arcade game vendors to settle on a specific hardware platform, and then use that same hardware for as many arcade games as possible. This was common practice at least as far back as Pac-Man in 1980.

Getting the machine code onto the hardware for testing could be a minor challenge. In some cases, EPROMs might need to be burned and plugged into the arcade board. More likely, the prototype development hardware would include a mechanism to download the target code over something simple like an RS-232 serial link from a development computer (JTAG-style debug ports came later). It's not that different, at least in concept, from how embedded software development is still done today.

In most cases, besides having extra development aids like RS-232 ports, prototype hardware could also include special software in ROM to provide a monitor program. This would allow the developer to do things like insert breakpoints in the code and examine memory in order to track down bugs. If the prototype was more advanced, an in-circuit emulator (ICE) might be available to simplify the debugging process by relying on additional hardware beyond just the target CPU's abilities. But even simple 8-bit CPUs of the time, like the MOS 6502, supported a breakpoint (BRK) instruction that could invoke a resident monitor program.
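
To illustrate the breakpoint idea, here is a toy C sketch of the classic monitor trick: save the byte at the breakpoint address, overwrite it with the BRK opcode (0x00 on the 6502), and trap into the monitor when it is fetched. The memory array, the fetch loop, and all the names are stand-ins invented for illustration, not code from any real monitor ROM.

```c
/* Toy model of a monitor-style breakpoint: patch the target address with the
 * 6502 BRK opcode (0x00), trap when it is fetched, then restore the original
 * byte. The "program" and "CPU" here are stand-ins, not a real emulator. */
#include <stdio.h>

#define MEM_SIZE   0x100
#define BRK_OPCODE 0x00

static unsigned char mem[MEM_SIZE];
static unsigned char saved_byte;
static int bp_addr = -1;

/* Monitor command: arm a breakpoint by patching in BRK. */
static void set_breakpoint(int addr)
{
    saved_byte = mem[addr];
    mem[addr] = BRK_OPCODE;
    bp_addr = addr;
}

/* Monitor's BRK handler: report where we stopped, then un-patch the code. */
static void monitor_brk(int pc)
{
    printf("monitor: hit breakpoint at $%02X (original byte $%02X)\n",
           pc, saved_byte);
    mem[bp_addr] = saved_byte;   /* restore so execution could resume */
}

int main(void)
{
    /* Fake "program": a run of NOPs (0xEA on the 6502). */
    for (int i = 0; i < MEM_SIZE; i++)
        mem[i] = 0xEA;

    set_breakpoint(0x10);        /* developer asks the monitor to stop at $10 */

    /* Toy fetch loop standing in for the CPU. */
    for (int pc = 0; pc < MEM_SIZE; pc++) {
        if (mem[pc] == BRK_OPCODE) {   /* BRK fetched -> trap to the monitor */
            monitor_brk(pc);
            break;
        }
    }
    return 0;
}
```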

All in all, the process was not very different from the bootstrapping of any software onto any new hardware that was done for other computers of the time. And, to a pretty large extent, the same techniques are still employed today. It's just that off-the-shelf components are much more available and advanced now than in the early days. This frees modern developers from having to reinvent the wheel on such low-level tooling for compiling, assembling, downloading code, testing, and debugging.

Raffzahn
Brian H
  • 2
    "It was not unusual for early processor vendors to provide a cross-assembler for their processor, and that could run on common mini or micro computers of the time." Same as nowadays, except that it's now a cross-compiler instead or as well. – TonyM Feb 29 '20 at 08:58
  • 8
    Admittedly I was not working on video game development back in the first half of the 1980s, but on other embedded applications. There was no JTAG programmability for the hardware; FLASH did not exist. Programs were stored in UV-EPROM, or we used special hardware with RAM in place of the EPROM to allow loading a built image for testing, or we used an ICE plugged in place of the processor. This could be halted and variables examined, much like the JTAG debuggers used today. – uɐɪ Mar 02 '20 at 13:45
  • 8
    A popular device was called a "ROMulator". It had a small board that was designed to plug into a ROM socket, and contained a RAM along with some switching circuitry that would connect the address and data buses to either the attached board or a ribbon cable which plugged into another board that contained a CPU of its own along with a small amount of RAM and ROM for use by that CPU, and a serial port to connect external hardware. This device would be used like an EPROM programmer, except that the memory being programmed would be the one in the little board sitting in the ROM socket. – supercat Jan 09 '21 at 21:09
6

How were the 70's and 80's coin-op programmed?

Not much different from today. But most definitely in a more hands-on manner.

Of course, all of this depended quite a lot on the company (size), the target platform, and most of all the time you're asking about. Development changed dramatically over just a few years from the mid-1970s to the mid-1980s. Where the very first developers had to use hand coding and/or mini/mainframe computers for cross-development, and guesswork for debugging, the tool landscape quickly developed into quite sophisticated hardware and software tools during the '80s.

What tools did a programmer use? Nowadays a programmer can use a PC, an IDE to program, test, use breakpoint...

Much the same back then.

From around 1975 to the mid-1980s, a dedicated development system would be used much like a PC today. These development systems were specialized modular systems like Intel's Intellec MDS (running the ISIS development environment) or Motorola's Exorciser. These systems provided integrated development. Maybe not as point-and-click as today, but incredibly cool back then.

Testing didn't happen on virtual hardware, as computers simply weren't fast enough for that, but on the 'real thing', using hardware debugging tools. An arcade board would be hooked up as a slave system using an in-circuit emulator (ICE). Development could be stopped on any condition within the (hardware-)emulated CPU, as well as on external conditions. This was complemented by ROM-simulation hardware.

but in the 80s?

Now, that's a big jump. In general the '80s continued this, but now standard PC-type machines replaced CPU-vendor-specific machines, with the added help of specialized developer versions of arcade boards. The same happened, by the way, for console development, except that there the development hardware was made available to external developers as well.

How could a programmer test the graphics, the algorithms and so on?

Usually by loading ROM-emulation units, plugged into the arcade board, from the development system and running the board as before. Tests happened in real time on the real hardware - with the ability to insert hardware breakpoints from the development system or to change ROM content on the fly.
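
As a rough sketch of the host side of that "load the ROM emulator" step, here is a hypothetical C program that streams a ROM image over a serial link. The device path, the one-byte "write block" command, and the address/length framing are all invented for illustration; real ROM-emulation units each spoke their own vendor-specific protocol.

```c
/* Hypothetical host-side loader: stream a ROM image to a ROM-emulation unit
 * over a serial link. The framing (command byte, address, length, data) is
 * invented for illustration; real units used vendor-specific protocols. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <unistd.h>

#define LOAD_CMD 0x57   /* made-up "write block" command byte */

int main(void)
{
    unsigned char rom[4096];                 /* pretend 4 KB game ROM image */
    for (size_t i = 0; i < sizeof rom; i++)  /* fill with dummy data        */
        rom[i] = (unsigned char)(i & 0xFF);

    /* /dev/ttyS0 is just an example path for the serial link to the unit. */
    int fd = open("/dev/ttyS0", O_WRONLY);
    if (fd < 0) {
        perror("open serial port");
        return EXIT_FAILURE;
    }

    /* Header: command, 16-bit target address, 16-bit length (little-endian). */
    unsigned char header[5] = { LOAD_CMD, 0x00, 0x00,
                                sizeof rom & 0xFF, sizeof rom >> 8 };
    if (write(fd, header, sizeof header) != (ssize_t)sizeof header ||
        write(fd, rom, sizeof rom) != (ssize_t)sizeof rom) {
        perror("write");
        close(fd);
        return EXIT_FAILURE;
    }

    close(fd);
    puts("ROM image downloaded to emulation unit (hypothetical protocol).");
    return 0;
}
```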


Hardware-based development was used well into the 2000s, until either standard hardware became fast enough for real-time emulation or, more importantly, console systems (like the Xbox or PS4) became more like PCs, thus enabling the same development cycle as for PC software.

Raffzahn
  • Given how much emphasis is put on the Exorciser using a "TTY" (as distinguished from a CRT terminal) interface, I'm having difficulty believing that the interface was anywhere near even the CP/M version of Turbo Pascal 1.0, much less an IDE. – cjs Feb 28 '20 at 18:44
  • @cjs And so it is hard from today's PoV to see Turbo Pascal as integrated. After all, there have been a few years of development in between, right? Compared to using a mini for assembling and doing everything else by hand, it's for sure high integration when tools are combined into one device with a consistent command-line interface. In retrospect it's not helpful to ask what a tool(set) could not do, but what it did offer in comparison at its time. Also, no one was prevented from using a (way more expensive) CRT terminal; Exorciser systems were offered with CRT support as well. – Raffzahn Feb 28 '20 at 18:52
  • Well, I do see Turbo Pascal as an "integrated development environment." Would you like to re-examine my comment given that? And do you have any thoughts on why Atari in the mid-80s continued to use multi-hundred-thousand-dollar VAXen as their primary coin-op development systems when the systems you discuss were available at one tenth the price? – cjs Feb 28 '20 at 19:18
  • @cjs Not sure what you want to have 're-examined'. Did TP include hardware debuggers? Did it integrate cross-compilation? Did it integrate handling of ROM emulation? For the Atari part, why should a company throw away a working system? That would be rather stupid, after crafting the tools the way they wanted them. Further, Atari is not the only company that developed coin-ops - not even in the US. – Raffzahn Feb 28 '20 at 19:19
  • This seems contradictory: on the one hand you seem to be claiming that ROM emulation and hardware debuggers and whatever else were an advantage of the Exorciser et al. over cross-development using a VAX, and then you say that Atari was sensible to continue using their "non-integrated" VAX development system over the "integrated" systems you're talking about? – cjs Feb 28 '20 at 19:23
  • From what I've read, Atari 2600 games using the Arcadia/Starpath Supercharger were developed on the Apple II, whose cassette port was used to feed data to the SuperCharger. Many years later, I used the SuperCharger as a development system for testing out some Atari 2600 code; the loader can accept data pretty amazingly fast when driven by a computer or CD instead of a physical cassette. – supercat Feb 28 '20 at 19:32
  • @cjs Why? Do you always spend the money to buy each new tool, and spend a multiple thereof to train your staff on the new tools, when you already have a working process? Company decisions have to take more into account. Also, Atari's development wasn't just about the VAX; they also used specialized development systems for implementation/debugging, for example Rockwell 6500 ICE systems. The question asked is not what one specific company used, but what the average development process for (microprocessor-based) coin-ops in the '70s and '80s was - right? – Raffzahn Feb 28 '20 at 19:35
  • @supercat IIRC the Starpath came rather late (82/83?) and is about a console, not coin op. But yes, you're right, it opened 2600 development for a new range of (smaller) developers using small microcomputers and running on a small budget. I never had one, though, it would have been cool. – Raffzahn Feb 28 '20 at 19:40
  • Sigh. I guess you weren't aware that the Exorciser was introduced a couple of years earlier than the VAX, and many years before the first VAX was bought for Atari coin-op. – cjs Feb 28 '20 at 19:52
  • And this is relevant how? MOS, as a very small company, didn't have the resources to offer development systems like Motorola or Intel did. Heck, they didn't even have the money to buy a state-of-the-art chip test system. So they set up a development process using a GE timesharing system (Peddle came from GE) with TTYs and tools like paper-tape-fed programmers (I still have a letter with the login). This was the environment MOS presented at WESCON 1975 and that Atari used for development. The move to VAX systems happened when timesharing went away and they didn't want to invest in a redesigned process. – Raffzahn Feb 28 '20 at 20:11
  • 3
    Here is a quote from Owen Rubin, developer of Cannonball, a 1976 Atari arcade game. "I wrote my first (non-vector) game, Cannon Ball, while sitting in my small office at a Model 33 teletype connected to a Motorola MicBug 6800 processor, both of which were connected to simple videogame hardware. I hand-assembled the entire program--it was only 2K, but still took several months--including self-test, saving the code on punched paper tape."

    Atari only used 6800 for a short time in the mid-70s, before adopting the 6502.

    – Memblers Mar 01 '20 at 14:13
  • Raffzahn - w.r.t. the use of ICE or dedicated development boxes - weren't those really expensive? (Or did the great expense come only later with the use of faster/wider processors?) I'm wondering if game companies did in fact use ICE. I kind of thought they were mostly used for hardware design where they were totally necessary and software people used simpler (but more tedious) techniques. – davidbak Aug 31 '22 at 20:49
  • @davidbak To start with, we're talking coin-op here, where hardware and game development happen in the same company; but more importantly, it's professional development, where hardware is usually the smallest cost component compared to paying staff. Last but not least, back in the early days there were no general-purpose high-performance PC systems that could emulate some game hardware. Whoever wanted to develop had to have some microcomputer system, so buying an ISIS (or alike) was the way to go. – Raffzahn Aug 31 '22 at 20:54
  • @Raffzahn: Another essential thing to note about coin op games is that many games had hardware platforms that were designed at least in part around the games to be implemented thereon. Someone implementing a game like Frogger (I'd guess the actual game was done this way but I'm not sure) would mostly use a tile-based graphics subsystem, but with its "vertical" smooth-scrolling register replaced by a 16x8 dual-port RAM, thus dividing the screen into vertical strips that could be scrolled separately. A game with a vertical scrolling starfield that plays no role in actual gameplay might... – supercat Sep 08 '22 at 17:21
  • ...include a dedicated linear feedback shift register whose period is one scan line or one pixel (depending upon screen orientation) longer than a frame, and superimpose a gray dot on the output video whenever a certain set of bits are all zeroes. Relatively cheap circuitry to enhance a game's aesthetic without requiring a bitmap frame buffer. – supercat Sep 08 '22 at 17:24
1

Here is an alternative answer to the same question, focusing less on the tools than the actual coding process.

You have to remember that the machines in question had seriously limited processing power, and even simple games like Space Invaders often required some rather clever programming to get them to work in the number of cycles available.

The best example of how difficult this was is in this video of the way Tempest worked. Because it was a vector system, the time needed to draw a line on the screen depended on how far the beam had to move. To reduce that time, the figures were designed to be compact and to consist, as much as possible, of a series of lines laid end-to-end. This is why one of the characters is a spiral: it's a bunch of straight lines connected together. The hardware included a simple system for changing the scale.

The graphics were then written as a series of instructions stored in the video memory. These included commands to move the beam, draw a vector, and change the color or scale. Additionally, there were instructions for jumping to other locations in the list and for doing call/return, which allowed the vectors to be broken up into "subroutines".

The video ROM consisted of a series of hand-drawn shapes, manually converted into a series of vectors. The resulting lists were then examined for redundancies - or even given some, where the shape didn't change too much from the original design - and those shared parts were placed in subroutines.

The game engine running on the 6502 CPU included a small routine which copied these shapes into video RAM as required and then looped over them, while also handling collision testing, scorekeeping, and similar tasks. However, the actual drawing was handled by custom hardware that looped over the "display list" independently of the CPU, going back to the start when it saw the HALT instruction. So the programmers were writing two programs: the game engine and the vector engine's custom code. All of this was in assembler, on another machine (a PDP-11, IIRC).
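
To make the display-list idea concrete, here is a toy C interpreter that walks such a list: draw a vector, change the scale, call and return from a shape "subroutine", and HALT. The operations mirror those described above, but the encoding, structures, and names are invented for illustration - the real vector generator parsed packed words in video RAM entirely in hardware.

```c
/* Toy interpreter for a vector-style display list: draw-vector, set-scale,
 * call/return to a shape subroutine, and HALT. The encoding is invented for
 * illustration; real hardware walked packed words in video RAM on its own. */
#include <stdio.h>

enum opcode { VCTR, SCAL, JSRL, RTSL, HALT };

struct cmd {
    enum opcode op;
    int dx, dy;        /* beam movement for VCTR */
    int arg;           /* scale factor for SCAL  */
};

/* A "shape subroutine": a small square, reusable from anywhere in the list. */
static const struct cmd square[] = {
    { VCTR,  4,  0, 0 }, { VCTR, 0,  4, 0 },
    { VCTR, -4,  0, 0 }, { VCTR, 0, -4, 0 },
    { RTSL,  0,  0, 0 },
};

/* Main display list: draw the square twice, at different scales. */
static const struct cmd display_list[] = {
    { SCAL,  0, 0, 1 },
    { JSRL,  0, 0, 0 },   /* "call" the square shape (target hard-coded here) */
    { VCTR, 10, 0, 0 },   /* move the beam */
    { SCAL,  0, 0, 2 },
    { JSRL,  0, 0, 0 },
    { HALT,  0, 0, 0 },   /* hardware restarts from the top next frame */
};

int main(void)
{
    int x = 0, y = 0, scale = 1;
    const struct cmd *pc  = display_list;
    const struct cmd *ret = NULL;     /* one-level "return address" register */

    for (;;) {
        switch (pc->op) {
        case VCTR:                    /* draw a line; the beam moves */
            x += pc->dx * scale;
            y += pc->dy * scale;
            printf("draw to (%d,%d)\n", x, y);
            pc++;
            break;
        case SCAL:
            scale = pc->arg;
            pc++;
            break;
        case JSRL:                    /* jump into the shape subroutine */
            ret = pc + 1;
            pc = square;
            break;
        case RTSL:                    /* return to the caller */
            pc = ret;
            break;
        case HALT:
            puts("HALT: frame done");
            return 0;
        }
    }
}
```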

Curiously, and often forgotten in modern documents, the Apple II used a similar system consisting of a series of bit-fiddled bytes that described individual one-pixel vectors to the same end. The original ROMs did not include any form of bitmap manipulation, in spite of it being a bitmap machine. This speaks to the sort of memory limitations people faced at the time, as the vectors were packed three to a byte, a significant savings.
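
For flavor, here is a small C sketch of that packing trick: one-pixel vectors stored three to a byte and decoded on the fly. The layout below (two plot-and-move fields plus one move-only field per byte) follows the commonly documented Applesoft shape-table scheme, but treat it as illustrative; details such as the special meaning of zero fields are deliberately left out.

```c
/* Illustrative decoder for one-pixel vectors packed three to a byte, in the
 * spirit of the Apple II shape tables mentioned above: two 3-bit fields
 * (plot flag + 2-bit direction) and one 2-bit move-only field per byte.
 * A simplified sketch -- the real Applesoft format also gives zero fields
 * special "skip/terminate" meaning, which is ignored here. */
#include <stdio.h>

static const int step_x[4] = { 0, 1, 0, -1 };   /* 0=up, 1=right, 2=down, 3=left */
static const int step_y[4] = { -1, 0, 1, 0 };

static void run_shape(const unsigned char *shape, int len, int x, int y)
{
    for (int i = 0; i < len; i++) {
        unsigned char b = shape[i];

        /* Field A: bits 0-2, field B: bits 3-5 (plot flag + direction). */
        for (int shift = 0; shift <= 3; shift += 3) {
            int dir  = (b >> shift) & 0x03;
            int plot = (b >> (shift + 2)) & 0x01;
            if (plot)
                printf("plot at (%d,%d)\n", x, y);
            x += step_x[dir];
            y += step_y[dir];
        }

        /* Field C: bits 6-7, move only, never plots. */
        int dir = (b >> 6) & 0x03;
        x += step_x[dir];
        y += step_y[dir];
    }
}

int main(void)
{
    /* One byte packs three vectors: plot then move right (field A = 101),
     * plot then move down (field B = 110), move left without plotting (11). */
    const unsigned char shape[] = { 0x05 | (0x06 << 3) | (0x03 << 6) };

    run_shape(shape, 1, 10, 10);
    return 0;
}
```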

Maury Markowitz