Closed AtomicShroom closed 1 year ago
What's the monitor you are playing on? This would happen to me with the Sega Genesis core on my Samsung LCD because the refresh rate is just barely different from standard refresh rates (SNES is slightly faster than standard NTSC, Genesis is slightly slower). Some kind of bug in how the Samsung tries to handle it.
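The scale of that mismatch is easy to sanity-check with a quick calculation. This is a sketch assuming the commonly cited NTSC SNES timing (a ~21.477 MHz master clock and 357,366 cycles per non-interlaced frame); with those numbers, a display locked to exactly 60 Hz drifts a full frame relative to the console roughly every 10 seconds, which is when a frame has to be dropped or repeated.

```python
# Rough drift calculation between the SNES's NTSC refresh rate and a fixed
# 60 Hz display. Timing constants are the commonly cited values for a
# non-interlaced NTSC SNES; treat them as an approximation.

SNES_MASTER_CLOCK_HZ = 21_477_272   # ~6x the NTSC colorburst frequency
CYCLES_PER_FRAME = 357_366          # non-interlaced NTSC frame

snes_hz = SNES_MASTER_CLOCK_HZ / CYCLES_PER_FRAME   # ~60.0988 Hz
display_hz = 60.0

# Seconds until the console has produced one more frame than the display
# has shown -- i.e. how often a frame must be dropped or repeated.
slip_seconds = 1 / abs(snes_hz - display_hz)

print(f"SNES refresh: {snes_hz:.4f} Hz")
print(f"One-frame slip every {slip_seconds:.1f} s")
```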
Footage was captured on my Samsung Q80R. If it's the TV, then why does the problem only start manifesting after about 3+ hours? Before that it runs silky smooth without a single frame skip.
Does it begin happening much sooner with vsync_adjust=2?
EDIT: Additionally, this thread is mentioned in the thread you linked, which might be relevant --> https://misterfpga.org/viewtopic.php?f=33&t=2874
Try to either turn on freesync in the mister.ini, or disable freesync manually on your TV to see if that fixes it for vsync_adjust=1.
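For reference, a minimal sketch of the relevant MiSTer.ini lines (option names taken from recent MiSTer documentation; exact names and supported values may differ between releases, so check the sample MiSTer.ini shipped with your build):

```ini
[MiSTer]
vsync_adjust=1   ; 0=use display's native mode, 1=match the core's refresh rate, 2=match with minimal buffering/latency
vrr_mode=1       ; 0=off, 1=auto-detect VRR, 2=force FreeSync, 3=force VESA HDMI VRR
```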
EDIT2: Do you think this comment specifically is applicable to your scenario? --> https://misterfpga.org/viewtopic.php?p=36755#p36755 - because it seems like this is expected in SMW
With vsync_adjust=2 I get weird constant screen tearing near the bottom of the screen instantly. It's very distracting and basically unusable.
freesync is already disabled on my TV. I tried turning on freesync in mister.ini when it was first released but couldn't get it to work no matter what setting I put my TV to.
Yes I'm aware that Mario World has slight stutters by default even on a CRT, but nothing like what you can see in my video.
Thanks for testing the other modes as a way to try and rule out some other kind of issue.
Try the freesync options again; I believe they've been greatly improved since they were first released.
This is what I get when I use either auto-detect or force VRR, whether I set the TV FreeSync option to Basic or Ultimate:
Does the TV still have the stuttering issue if you turn the TV off and back on once it starts occurring?
Ugh, I'm pulling my hair out trying to make sense of this.
So after hours of testing, it would appear that my initial assumption, that it occurs after a certain period of time, was flawed from the beginning. I let MiSTer run for over 5 hours with the TV on and it didn't stutter at all. Then I changed to Live TV, came back to the MiSTer input: And bam. Stuttering.
Looks like it's the TV doing this after all. I'm guessing that when I first encountered the issue, I hadn't realized I had changed to another input and come back to MiSTer. Then anytime I tested, I would just let MiSTer run while I played something else or watched TV.
I'm guessing changing to another input resets the TV's refresh rate back to 60 Hz, but when going back to MiSTer it doesn't re-establish sync with the unusual 60.1 Hz SNES signal.
Strangely enough, soft-rebooting MiSTer and relaunching the SNES core fixes the stuttering, so when MiSTer initializes a core it seems to send something to the TV to tell it what the refresh rate is? I'm wondering: does a TV send any sort of signal when switching inputs to let the device know it's active again, or is it a one-way thing? And if it does, is this something that MiSTer could pick up on, so it could redo whatever it does when it inits a core's refresh rate?
Refresh rate is not something that can be sent as data. The TV just measures it itself and tries to sync if possible. Don't treat your TV as an ideal bug-free device. The issue can still be inside your TV, or in specific settings. It may be fixable or not...
@AtomicShroom I noticed upon re-reading that you are using HDMI input 3. It could also be related to this: many TVs have different capabilities depending on the input number. Usually input 1 is the best for any kind of gaming, and some TVs even restrict VRR to only 1 or 2 out of 4 inputs, etc...
I can't find any indication that any of the HDMI ports on the TV have different specifications, other than HDMI 2 which has ARC. Guess I'll just live with using vsync_adjust=0. Thanks guys!
Using vsync_adjust=1 on HDMI, MiSTer will run flawlessly for the first ~3 hours, but then will start exhibiting buffer hiccups roughly every 10 seconds, where it seems to pick the wrong buffer to display, effectively showing older frames instead of the latest one.
See here from 0:02 to 0:04: https://youtu.be/13gPV8z7a5g (filmed in slo-mo at 240fps)
Seems like after some time, the buffer handling runs into some race condition which causes the incorrect buffer to be displayed. This seems oddly similar to an issue I had reported (and was thankfully fixed) in vsync_adjust=0: https://misterfpga.org/viewtopic.php?t=2687
Steps to repro: