Vutshi closed this issue 5 months ago.
Thanks for the report. The blue bar you're seeing is a debug view of the hblank area - eventually I need to hide that unless you have the 'debug' aperture selected, but the game is setting up a screen geometry that MartyPC isn't happy about. It looks like it is set up for low resolution mode, but the clock divisor is left for high resolution mode.
I note that the game runs fine in DOS 3.3. Games typically set up video modes via BIOS calls, or directly via setting the CRTC registers. DOS usually doesn't have anything to do with it - unless DOS 4.0M has to have its own video routines for the task-switching functionality.
I have considered before that my logic for when to adjust the clock divisor may be incorrect. I attempted some changes to that logic to derive it from the programmed CRTC registers, but that broke more things than it fixed. I may have to do a bit more research there.
I also note that if you try it a few times in a row, occasionally it does run properly.
You can see the issue here when it is not working: the 'hires text' bit of the mode register is set:
The high res text bit will override the 'graphics' mode bit, so we stay in hires text mode with the full 14 MHz dot clock - but the CRTC registers have been programmed to only display half the screen width, which is why we get to see the hblank period debug stripe.
When the game is running properly the hires text bit is cleared.
This bit is what I use to determine the character clock.
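In pseudocode, the divisor selection described above amounts to something like the following sketch (illustrative only, not MartyPC's actual code; constants and names are assumptions):

```rust
// Hypothetical sketch of deriving the CGA dot-clock divisor from the mode
// register, per the behavior described above. Bit positions follow the
// documented CGA mode control register layout.

const MODE_HIRES_TEXT: u8 = 0b0000_0001; // bit 0: 80-column (hires) text

/// Hires text runs the full 14 MHz dot clock (divisor 1); everything else
/// divides it down to 7 MHz (divisor 2). Note that on the IBM CGA the hires
/// text bit takes priority over the graphics bit, which is exactly the
/// quirk under discussion in this issue.
fn clock_divisor(mode: u8) -> u32 {
    if mode & MODE_HIRES_TEXT != 0 {
        1 // 14 MHz dot clock, HCLK character clock
    } else {
        2 // 7 MHz dot clock, LCLK character clock (40-col text and graphics)
    }
}
```

With this rule, the game's problematic mode byte (`11001111`) selects the full-speed clock even though the CRTC is programmed for a half-width screen.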
> I also note that if you try it a few times in a row, occasionally it does run properly.
Oh, I never managed to be lucky enough to run it properly in MS-DOS 4.0M on MartyPC. In MS-DOS 3.3 it always works well as you noticed already.
I also have another version of this game as an `.EXE` file. The `.COM` version included with the DOS image seems to be just a wrapped `.EXE` with some minor changes (it supports another port for the joystick). The point is that the `.EXE` version works funny in this MS-DOS 4.0M even in QEMU (I don't know about real hardware in this case). So this DOS is doing something unusual.
Btw, I don't see `External Registers` in the Video Card Viewer. How do you turn it on?
Took another look at the schematic. The character clock is derived here:
Either the LCLK (low resolution character clock, used in 40-column and graphics modes) or the HCLK (high resolution character clock) is selected based on the value of the HRES bit from the mode register. So that checks out. Correspondingly, either the 14 MHz or 7 MHz dot clock is selected by the same bit.
It doesn't surprise me that it might work on another emulator - QEMU is not known for slavish adherence to hardware quirks. Most emulators work out screen geometry from what is programmed into the CRTC registers - if QEMU sees a low resolution mode, it will display a low resolution mode. It is not bothering to clock the CGA card tick for tick like MartyPC does.
But if you say this works on hardware, then that's puzzling. I'll need to try it on my 5150 and probe the output of the HIRES pin coming out of the mode register chip. If it's high, well, that's going to be puzzling, as it would conflict with the schematic. If it's low, but high in MartyPC, there may be some other reason the mode register isn't being updated properly.
> Btw, I don't see `External Registers` in the Video Card Viewer. How do you turn it on?
I actually added it just now while debugging this. :D
> But if you say this works on hardware, then that's puzzling.
Yes, it works on an XT clone. I don't have a real 5150.
Technically, setting both the 'hires text' and 'graphics mode' bits is not a valid combination. We know what that does on the original IBM CGA card - but the behavior may not be the same on clones. A clone CGA could decide to have the graphics bit take priority - which in this case, would probably make it work.
@Vutshi Try it in 86Box, it's much more accurate to real old PC hardware than QEMU is. It's not quite as accurate as MartyPC is, but it should give a better idea of whether or not this is a MartyPC bug.
It works in 86Box, but I don't know if they're handling that invalid bit combination in an accurate way.
This was one of the discoveries made during the development of Area 5150; they used it to control the overscan color in text mode.
To keep Area 5150 compatibility by default, I can't change the default behavior, but I suppose I could add a 'clone' boolean flag to the CGA card configuration overlay, that would switch the priority of the hires text and graphics bits, maybe even expose it as checkbox when you're operating a CGA.
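Such a flag could be as simple as swapping which bit wins when both are set. A hypothetical sketch (the flag and names are illustrative, not actual MartyPC configuration):

```rust
// Hypothetical sketch of a 'clone' flag that flips the priority of the
// hires text and graphics bits when both are set. Names are illustrative,
// not real MartyPC code.

#[derive(Clone, Copy, PartialEq, Debug)]
enum DisplayMode {
    HiResText,
    LoResText,
    Graphics,
}

fn resolve_mode(mode: u8, clone_behavior: bool) -> DisplayMode {
    let hires_text = mode & 0b01 != 0;
    let graphics = mode & 0b10 != 0;
    match (hires_text, graphics) {
        // Hypothetical clone: graphics bit wins over hires text.
        (true, true) if clone_behavior => DisplayMode::Graphics,
        // IBM CGA: hires text bit wins (the Area 5150 behavior).
        (true, _) => DisplayMode::HiResText,
        (false, true) => DisplayMode::Graphics,
        (false, false) => DisplayMode::LoResText,
    }
}
```

Under this sketch, the game's invalid mode byte would resolve to graphics mode with the flag set, and to hires text mode (the current behavior) without it.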
The other option is another CRTC-based hack that would try to ignore the hires text bit if a lowres mode is programmed into the CRTC registers, but I've been burned before trying to add logic that the card simply doesn't have. Custom video modes deviate from the defaults by quite a bit, so detecting what is a low res mode and what isn't based on just the horizontal total register isn't as trivial as it sounds, and you risk breaking more than you fix.
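To illustrate the fragility: a naive heuristic keyed off the CRTC horizontal total register (R0) has to pick an arbitrary threshold, and custom modes fall on either side of it. The standard BIOS defaults are roughly 56 for 40-column and graphics modes and 113 for 80-column text; the threshold below is made up:

```rust
// Hypothetical heuristic: guess "low res" from the CRTC horizontal total
// register (R0). Standard values are ~56 for 40-col/graphics modes and
// ~113 for 80-col text, but demos reprogram R0 freely, so any fixed
// threshold like this misclassifies custom modes -- the risk described above.
fn looks_like_lowres(horizontal_total: u8) -> bool {
    horizontal_total < 85 // arbitrary cutoff between 56 and 113
}
```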
I would say it is better for MartyPC to do what the IBM XT does and not bother with the clones; there are too many of them.
Do you understand why the invalid bit combination affects only the game running in MS-DOS 4.0M?
The mode register is only set to the invalid state in MS-DOS 4.0M.
DOS 4.0M:
[2024-06-11T14:05:55Z DEBUG marty_core::devices::cga] Write to CGA mode register: 00000000
[2024-06-11T14:05:55Z DEBUG marty_core::devices::cga] Write to CGA mode register: 11001111
DOS 3.3:
[2024-06-11T14:07:44Z DEBUG marty_core::devices::cga] Write to CGA mode register: 00000000
[2024-06-11T14:07:44Z DEBUG marty_core::devices::cga] Write to CGA mode register: 00101010
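Decoding those two writes against the documented CGA mode control register (port 0x3D8) bit layout makes the invalid combination visible. A small sketch (bit names follow the IBM CGA technical reference; the code itself is illustrative):

```rust
// Decode a CGA mode control register (port 0x3D8) byte into its documented
// bit fields, per the IBM CGA technical reference. Bits 6-7 are unused.
fn decode_mode(mode: u8) -> Vec<&'static str> {
    const BITS: [(u8, &str); 6] = [
        (0b0000_0001, "hires text (80 col)"),
        (0b0000_0010, "graphics"),
        (0b0000_0100, "b/w"),
        (0b0000_1000, "video enable"),
        (0b0001_0000, "640x200 graphics"),
        (0b0010_0000, "blink enable"),
    ];
    BITS.iter()
        .filter(|&&(mask, _)| mode & mask != 0)
        .map(|&(_, name)| name)
        .collect()
}
```

Decoded this way, `11001111` sets both 'hires text' and 'graphics' (the invalid pair), plus the two undocumented high bits, while `00101010` is an ordinary graphics setup with video enabled and blink on.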
MS-DOS 4.0M is strange indeed. I wonder why they set this invalid configuration… Maybe it explains why my Book8088 was unable to even boot this DOS. It somehow doesn’t recognize it as bootable.
I am curious about when the write of `11001111` to port `0x3D8` happens in DOS-4.0M. I monitor the byte value at `0x465` (the CGA mode byte in the BIOS data area) and see the following:

During the start of the game, DOS-5.0 writes the value `0x2A` (`00101010`) to byte `0x465` and to port `0x3D8`. After that, the game runs properly without altering the CGA mode byte `0x465`.
On the other hand, DOS-4.0M writes the value `0x01` to byte `0x465` and to port `0x3D8`. After that, the blue bar appears. Where does the value `11001111` come from, and why is another incorrect value, `0x01`, being written by a BIOS routine...
Furthermore, the game does a correct `int 10h` call to switch the CGA mode in DOS-4.0M:
Then something goes wrong as shown in the previous comment.
What is the address of the INT 10h handler when it is called? (Look at the IVT table viewer, entry 16.) If it is not in the F000 segment, it has been overridden.
EDIT: it is.
The original INT 10h BIOS routine is at F000:F065. Setting a breakpoint on that, we eventually arrive at it with AH==0 and AL==4, which should set up CGA 320x200 graphics mode. So far the mode register looks OK.
So why is it getting trashed?
Something very weird is going on. If I set a breakpoint on the CGA card's `handle_mode_register()` call, I never see that invalid write, and the game starts up normally.
EDIT: Does this only happen under GLaBIOS??
> Does this only happen under GLaBIOS??
I don’t know. I’ve never tried anything but GLaBIOS.
I think it's a bug in GLaBIOS. Working with the author to confirm.
EDIT: confirmed and fixed. Basically, GLaBIOS would load the mode byte parameter from the wrong memory location if the int 10h vector was overridden due to a missing segment prefix on XLAT.
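For context on why a missing segment prefix matters here: `XLAT` loads `AL` from `[segment:BX + AL]`, where the segment defaults to `DS` unless an override prefix (e.g. `CS:`) is emitted. A BIOS lookup table living in ROM at `CS:BX` is therefore fetched from the wrong place whenever the caller's `DS` doesn't happen to match. A small illustrative model of the 8088 address calculation (not actual GLaBIOS or MartyPC code):

```rust
// Model of the 8088 XLAT effective-address calculation: the linear address
// is segment * 16 + ((BX + AL) mod 65536). Which segment register is used
// depends on whether a segment-override prefix precedes the instruction.
fn xlat_linear_addr(seg: u16, bx: u16, al: u8) -> u32 {
    let offset = bx.wrapping_add(al as u16); // 16-bit wraparound, as on hardware
    ((seg as u32) << 4) + offset as u32
}
```

With the BIOS at segment `0xF000` and a caller whose `DS` points elsewhere, the same `BX` and `AL` yield two completely different linear addresses, which is exactly how the wrong mode byte gets loaded when the prefix is missing.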
I see. It never even occurred to me that the BIOS might be faulty. I’ll keep that in mind if I encounter another bug.
Many mysteries are resolved at once. I checked the original IBM BIOS, and it works well with DOS-4.0M and Pac-Man. However, Sergey Kiselev's BIOS doesn't boot this DOS, which explains my Book8088's behavior.
Thanks to 640KB for the rapid turnaround, I've committed fixed GLaBIOS ROMS here: https://github.com/dbalsom/martypc/commit/b01561c192e1fd2f85c6cada9b5b6f2cf8411fdf
Thanks again for the report.
This is cool. Thank you. So now MartyPC has a newer GLaBIOS than the upstream ;)
**Describe the bug**
There is a deviation of MartyPC from QEMU and real hardware (an XT clone) on this strange version of MS-DOS. I ran a simple game, and it does not start, only showing a blue bar on the screen.

**To Reproduce**
Steps to reproduce the behavior: run `PACKMAN.COM`.

**Expected behavior**
In QEMU and on real hardware, one can play this Soviet clone of Pac-Man and even switch between different tasks using `Alt`. However, the OS is very buggy.

**Environment (please complete the following information):**
Build info

**Image**
MSDOS400M-DISK1+game.img.zip