16

(Note: by "object-compatible" I mean that the opcodes and their following operands are the same—the assembler produces the same output for equivalent assembler mnemonics. This of course excludes calling conventions, etc., because those are determined by the code one chooses to write.)

The Game Boy uses a Sharp LR35902 CPU (sometimes called a GB-Z80) that's usually said to be similar to an 8080. However, looking at some of the opcodes they don't seem compatible.

For example, on the 8080 and Z-80, opcode 3A loads the contents of the memory address specified by the next two bytes into the A register: LDA nnnn in 8080 assembler or LD A,(nnnn) in Z-80 assembler. On the GBDevWiki, opcode 3A is listed as ldd A,(HL), loading into A the byte at the address in HL and then decrementing HL. (This post-decrement addressing mode doesn't seem to exist at all on the 8080 or Z-80.) This seems confirmed by the WLA DX assembler opcode table, which lists it as LDD A,(HL) (with alternatives LD A,(HL-) and LD A,(HLD)).
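
Assuming I've read the tables right, the semantics can be sketched in a few lines of Python (the dict-based registers and memory are purely illustrative, not any real emulator's API):

```python
# Toy model of LR35902 opcode 3A, LDD A,(HL): load A from the byte HL
# points at, then decrement HL. Illustrative only.
def ldd_a_hl(regs, mem):
    hl = (regs["H"] << 8) | regs["L"]
    regs["A"] = mem[hl]            # A <- (HL)
    hl = (hl - 1) & 0xFFFF         # HL <- HL - 1, wrapping at 16 bits
    regs["H"], regs["L"] = hl >> 8, hl & 0xFF

regs = {"A": 0x00, "H": 0x98, "L": 0x00}
mem = {0x9800: 0x42}
ldd_a_hl(regs, mem)
# regs["A"] is now 0x42, and HL has become 0x97FF
```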

The direct load on the LR35902 seems to be opcode FA (ld A,(nnnn), taking 16 cycles instead of the Z-80's 13); FA is a conditional-jump instruction on the 8080 and Z-80 (JM nnnn and JP M,nnnn respectively).
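
To summarize how these two opcode bytes read on each CPU (mnemonics only, as I read the respective opcode tables):

```python
# Same byte, three different meanings, per the opcode tables cited above.
OPCODE_0x3A = {"8080": "LDA nnnn", "Z-80": "LD A,(nnnn)", "LR35902": "LDD A,(HL)"}
OPCODE_0xFA = {"8080": "JM nnnn", "Z-80": "JP M,nnnn", "LR35902": "LD A,(nnnn)"}
```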

Am I looking at things terribly wrong somehow, or is the LR35902 actually a rather different CPU from the 8080/Z-80 that just happens to share the 8080 register set? And, perhaps, Z-80 assembly syntax: the freely available assemblers use this, though I don't know if the Nintendo dev kits did.

If this is the case, it seems that I should study the LR35902 instruction set carefully before writing code for it, because some of the very useful changes (such as indirect with post-decrement) wouldn't occur to an 8080 or Z-80 programmer.

cjs
  • Instead of trying to explain a difficult term, it may be more appropriate to switch to a less ambiguous one - like binary compatible in this case? – Raffzahn Jul 24 '19 at 16:17
  • @Raffzahn I don't think "object-compatible" is ambiguous, and I agree with Wilson that "binary-compatible" has a different meaning from what I described (i.e., that it covers calling conventions and APIs as well). That said, though Wilson and I seem to think alike on this, Wikipedia doesn't seem to distinguish the two. – cjs Jul 24 '19 at 16:27
  • No doubt we could talk about this in many ways; one would be the question of whether there's a mix-up in the definition between binary compatible and ABI - the latter does reflect a relation to a certain OS interface, the former not. But we can skip that, as the need to explain what object means here already points to it being easily misinterpreted, doesn't it? Maybe use some other term like opcode compatible, which would narrow it down to the binary representation of instructions, wouldn't it? – Raffzahn Jul 24 '19 at 18:30
  • They have differences that make the Z-80 and LR35902 incompatible in either direction. See this answer for details: https://stackoverflow.com/questions/52009005/could-anything-with-a-z80-processor-run-gameboy-games/52019949#52019949 – George Phillips Jul 24 '19 at 19:09
  • @GeorgePhillips That's a great help, especially the chart of opcode differences. – cjs Jul 24 '19 at 20:09
  • Is the LR35902 a derivative of the LH5801 from the PC-1500 pocket computer? – Patrick Schlüter May 04 '20 at 15:47
  • @PatrickSchlüter The LH5801 is another one altogether, yet another incompatible one. – Omar and Lorraine Jan 08 '21 at 15:24

2 Answers

22

However, looking at some of the opcodes they don't seem compatible.

There's your answer.

The LR35902, the Z80 and the 8080 really are different CPUs. They are similar in many ways, such as the register set and much of the programming model.

The Z80 does not have the HL post-decrement addressing mode you're talking about, and the Z80 has some things the LR35902 doesn't, for example a second register set.

Am I looking at things terribly wrong somehow, or is the LR35902 actually a rather different CPU from the 8080/Z-80 that just happens to share the 8080 register set?

No, you are completely correct. It's a rather different CPU. The Z80 is almost completely backward compatible with the 8080, excepting some undocumented bits and bobs, but the LR35902 has some opcodes remapped, some bits added or removed. So it's a Z80 "inspired" CPU, not intended to be binary compatible with either!

Omar and Lorraine
5

Is the Game Boy Sharp LR35902 object-compatible with the 8080/Z-80?

Depends what you call object-compatible. If that's about some object file format, then it depends more on the toolchain you're using.

If the question is whether one is directly upward compatible with either at the opcode level, then the answer is rather no.

Am I looking at things terribly wrong somehow, or is the LR35902 actually a rather different CPU from the 8080/Z-80 that just happens to share the 8080 register set?

Yes ... and no. It is a CPU based on the 8080 design, much like the Z80 is. But unlike the Z80 it isn't fully compatible as many operations got dropped to make room for 'new' operations. Many of them look inspired by the Z80, but are rescheduled to improve performance.

One issue with extending the 8080 is that its opcode space is already quite full, leaving only a few code points to be used. To cram in all the extensions the Z80 added, Zilog had to use two-byte opcodes (*1), which adds a full fetch penalty - rendering many of them way less efficient than they could have been.

Nintendo (Sharp) avoided this by simply dropping many nice but lesser-used instructions, as well as the Sign and Parity flags with their corresponding 12 test operations (JP/CALL/RET), to make room for what they thought to be great extensions that needed to be fast. At the same time they also implemented some 'complex' opcodes which look a bit like Z80 two-byte opcodes, even encoded as CBxxh like some Z80 instructions, but these are all fixed two-byte opcodes with no additional parameters, executing in 8 cycles for the register forms.
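
The regularity of that CB page is easy to see in a decode sketch (my own illustration, not from any official documentation); notably, on the LR35902 SWAP sits where the Z80 keeps its undocumented SLL:

```python
# Sketch: decoding the byte that follows a CB prefix on the LR35902.
# Bits 7-6 select the group, bits 5-3 a bit number (or shift kind),
# bits 2-0 the operand register.
REGS = ["B", "C", "D", "E", "H", "L", "(HL)", "A"]
SHIFTS = ["RLC", "RRC", "RL", "RR", "SLA", "SRA", "SWAP", "SRL"]

def decode_cb(op):
    group, n, reg = op >> 6, (op >> 3) & 7, REGS[op & 7]
    if group == 0:
        return f"{SHIFTS[n]} {reg}"
    return f"{('BIT', 'RES', 'SET')[group - 1]} {n},{reg}"

print(decode_cb(0x7E))  # BIT 7,(HL)
print(decode_cb(0x37))  # SWAP A
```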

Some of the confusion about it being a Z80 or alike may come from the use of Z80 mnemonics for this 8080ish CPU. My guess would be that at that point in time programmers were more familiar with Zilog's mnemonics than classic Intel ones.

Bottom line, the Game Boy CPU is a true descendant of the 8080 in the sense that it takes its basic recipe from it, as well as of the Z80 by borrowing some of its instruction variations, icing it all over with its own changes, like no I/O address space, the High Page, and auto increment/decrement, making it a superb cake of its own.
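
To make the High Page bit concrete, a minimal sketch (the dict "memory" and function name are my own illustration, though 0xFF44 really is the Game Boy's LY register):

```python
# LDH A,(n) on the LR35902 (opcode F0): the single operand byte addresses
# the "high page" 0xFF00-0xFFFF, where the Game Boy maps its I/O registers.
# One byte shorter than a full 16-bit direct load.
def ldh_a_n(regs, mem, n):
    regs["A"] = mem[0xFF00 + n]

regs = {"A": 0x00}
mem = {0xFF44: 0x90}   # 0xFF44 is LY, the current LCD scanline
ldh_a_n(regs, mem, 0x44)
# regs["A"] is now 0x90
```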


*1 - It doesn't matter how some documentation tries to make them look better by calling them escape codes or the like; for all practical purposes they are two-byte opcodes, adding another fetch penalty to execution.

Omar and Lorraine
Raffzahn
  • It seems, at a quick glance, as if it might be more than just "dropping" instructions to make room for new ones; why would they keep LDA direct but just change the opcode for it? And I hadn't realized that they'd changed the flags so drastically! (3 flags dropped, 2 new ones added.) – cjs Jul 24 '19 at 13:39
  • "Object compatibility" usually means "can run all the same object files", i.e. is binary compatible, excluding things like syscalls or timing differences. – Omar and Lorraine Jul 24 '19 at 13:41
  • @CurtJ.Sampson I can't see how they moved the opcode for MVI. It was and still is 3Eh. Flag-wise they didn't drop 3/add 2, but 'only' dropped two flags (P, S) (thus 2x2x3 instructions removed) and added one new flag (N), with no direct tests added. – Raffzahn Jul 24 '19 at 13:54
  • @Wilson you're right, the word is binary compatible, as object can have many different levels of meaning. In the case of assembly it usually means the intermediate representation before linking into a binary - that's why I was rather careful/asked. – Raffzahn Jul 24 '19 at 13:59
  • I usually understand binary compatibility to include things like syscalls and calling conventions, which are obviously not important for this discussion. – Omar and Lorraine Jul 24 '19 at 14:01
  • @Wilson Yes. I guess if the environment is included, binary compatibility may include more parts - like the OS, as well as any I/O. But here we're talking bare CPU, aren't we? Otherwise the question would be moot, as no device but the GB has the same HSI as the Game Boy. – Raffzahn Jul 24 '19 at 14:06
  • I didn't say they moved MVI (load immediate), I said they moved LDA (load direct, from 3A to FA). – cjs Jul 24 '19 at 14:15
  • Wilson's definition of "object-compatible" is the one I was using; I've updated the question to (I hope) clarify that. Sorry for not doing that at the start. – cjs Jul 24 '19 at 14:17
  • @CurtJ.Sampson Ah ja, I see. They might have found it more convenient that way? After all, as soon as a CPU is no longer fully binary (upward) compatible - that was dropped by removing the flags - pulling over a binary doesn't make much sense. It will need to be recompiled anyway - if only to see if there's some offending instruction hidden. A system like the GB is much more like an embedded system than a desktop computer: there is no dynamic loading or user-side swapping of components. (GB) games are monolithic applications created specifically for this environment. – Raffzahn Jul 24 '19 at 14:32
  • Yeah, my guess would be that once they'd abandoned object-compatibility in a non-backward-compatible way (by removing instructions that would otherwise normally be used, such as ones based on the flags you mentioned) they just re-assigned opcodes whenever that made things more convenient for them. – cjs Jul 24 '19 at 14:34
  • @CurtJ.Sampson Now, since you're already talkative - especially about the details - I'm curious what in the above answer discourages you from liking it? – Raffzahn Jul 24 '19 at 14:36
  • Hmm. I think it doesn't really focus on what I was aiming at with the question, though the list of changes you imply is interesting. Perhaps you could add some more specific details about what they took out (the dropped flags and their test operations bit is great) and what they added, beyond the high-page stuff? – cjs Jul 24 '19 at 14:45
  • LOL. The above answer follows quite closely the way your question is written, doesn't it - and I guess you're aware of the old maxim that communication is not what the sender encodes but what the receiver decodes? Anyway, it worked; I got the expected reply. – Raffzahn Jul 24 '19 at 14:53
  • Indeed, I agree that if the receiver appears to be decoding something incorrectly blame must be shared between the encoder and decoder in some proportion, sometimes more towards the former. – cjs Jul 24 '19 at 16:16