libretro / RetroArch

Cross-platform, sophisticated frontend for the libretro API. Licensed GPLv3.
http://www.libretro.com

G-Sync support #1633

Status: Closed (nfp0 closed this issue 5 years ago)

nfp0 commented 9 years ago

Regarding support for variable refresh rate systems like G-Sync or FreeSync: is RetroArch supposed to work correctly with them?

I have a G-Sync capable monitor, and from my tests, it seems RetroArch is not taking advantage of G-Sync appropriately.

I understand RetroArch uses Dynamic Rate Control to sync video rendering to the reported monitor frequency. For this to work on, say, a 60Hz screen, a 60.01Hz game is slowed down to 60Hz, and games that run at significantly different frame rates cannot take proper advantage of Dynamic Rate Control. With G-Sync, the emulator doesn't have to sync the video: a 60.01Hz game will display at exactly 60.01Hz on the monitor, a 50Hz game will display at 50Hz, a 75Hz game at 75Hz, and so on. Now that variable refresh rate systems are a reality, why can't RetroArch leverage this advantage? In theory I would just need to disable V-Sync in the emulator settings and the driver would take care of synchronizing the monitor to the precise frequency RetroArch is outputting, making for a perfect experience with no stuttering, no tearing, and no added input lag.
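(For context, here is a rough sketch of the dynamic-rate-control idea described above. It only illustrates the concept; it is not RetroArch's actual code, and the function name and half-full target are assumptions.)

```c
/* Illustrative sketch of dynamic rate control (not RetroArch's actual code):
 * the audio resampling ratio is nudged every frame in proportion to how far
 * the audio buffer is from half-full, steering it back toward the midpoint.
 * A small delta (e.g. 0.005) keeps the resulting pitch change inaudible,
 * which is what lets video stay locked to the monitor's refresh. */
#include <stddef.h>

static double rate_control_ratio(double base_ratio,
                                 size_t queued_samples,   /* currently buffered */
                                 size_t buffer_capacity,  /* total buffer size  */
                                 double delta)            /* e.g. 0.005         */
{
   double half      = (double)buffer_capacity / 2.0;
   double deviation = ((double)queued_samples - half) / half;  /* in [-1, 1] */

   /* The sign convention depends on whether the ratio is applied to the
    * core's output or the audio driver's input; the point is that the
    * correction is tiny and proportional to the buffer deviation. */
   return base_ratio * (1.0 + delta * deviation);
}
```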

I've tried many different configurations in RetroArch with the bsnes core, and I can't seem to get it to work; I always get video stuttering. With the monitor capped at 60Hz it seems to work correctly, but that negates the reduced input lag of running at 120Hz. Turning on black frame insertion also solves the problem, but I don't want to use black frame insertion, and that introduces input lag as well.

I would greatly appreciate it if anyone looked into this issue. Thanks.

hizzlekizzle commented 9 years ago

I believe the way to do it is to disable vsync and make sure audio sync is on (that is, block on audio rather than video). You might try reducing the audio latency, as well.
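For reference, that advice maps roughly onto the following retroarch.cfg entries (a sketch; key names and defaults can differ between versions):

```
video_vsync = "false"     # let G-Sync/FreeSync pace the display
audio_sync = "true"       # block on audio instead of video
audio_latency = "32"      # audio buffer size in ms (try 64 if you get crackling)
```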

inactive123 commented 9 years ago

Ask @ToadKing, he owns a G-Sync monitor and has used it successfully with RetroArch before.

He can explain to you which settings to enable/disable.

nfp0 commented 9 years ago

@hizzlekizzle That solution makes sense, but I've searched for the audio sync option and I can't find it anywhere.

@twinaphex Thanks, I've sent him an email.

inactive123 commented 9 years ago

> @hizzlekizzle That solution makes sense, but I've searched for the audio sync option and I can't find it anywhere.

Go to Settings -> Audio Settings, it should be there. If it isn't, go to Settings -> Menu Settings and make sure 'Advanced settings' is set to 'ON'.

aliaspider commented 9 years ago

It is hidden by default, I think, so enable advanced settings first: Settings -> Menu Settings -> Show Advanced Settings.

nfp0 commented 9 years ago

@twinaphex Are you referring to the "Audio sync enable" option? If so, it was already enabled by default. Or is it another option?

Awakened0 commented 9 years ago

My 120Hz monitor doesn't have G-Sync, but changing VSync Swap Interval to 2 fixes the stutter at that refresh rate for me.

Edit: Oh, duh, I guess that wouldn't work if you're disabling Vsync to let Gsync take over. I dunno then :/

nfp0 commented 9 years ago

@Awakened0 I tried that before and it also works for me (only if I also enable Hard GPU Sync though), but I wanted to take advantage of the perfect uncompromised sync that G-Sync allows.

nfp0 commented 9 years ago

@Awakened0 No, the thing is: just disabling V-Sync is precisely what should make it work. G-Sync is doing its job, because I don't get any screen tearing even with V-Sync off, but RetroArch doesn't seem to be outputting frames in a smooth manner. Both RetroArch and FRAPS report that it's outputting 60fps, which is correct. I'm clueless as to what might be missing and why the hell it works fine when I enable black frame insertion.

Awakened0 commented 9 years ago

I mean that the swap interval setting only applies when Vsync is enabled in RetroArch. Maybe if it was possible to use swap interval 2 with Vsync off it would fix your stuttering while maintaining Gsync.

aliaspider commented 9 years ago

What problems are you having with V-Sync off and audio sync on? Stuttering? You can try the "limit maximum run speed" option in General Settings to use a CPU timer instead; try with audio sync both on and off and see if that helps. There is also something else you can try: set both "Audio Rate Control Delta" and "Audio Maximum Timing Skew" in Audio Settings to 0, which will prevent changing the audio rate reported by the core.

nfp0 commented 9 years ago

@Awakened0 Yes, it would be useful if swap interval 2 also worked with V-Sync off. That way we could duplicate frames independently of whether V-Sync is on, which would be useful for 120Hz monitors.

@aliaspider The problem is stuttering, yes. I disabled "audio sync" and the game ran much faster, obviously. Activating "limit maximum run speed" restored the speed to normal, but the stuttering came back. Not as much, but it's still there. When I disable G-Sync and use "audio sync" + V-Sync + "hard GPU sync", it's 100% smooth; it never stutters. Disabling audio sync and enabling the CPU timer made sense, so why is there still some stuttering remaining? It should be perfect with the configuration you suggested. : /

EDIT: Oh my... @aliaspider It seems your method works IF I also enable "Hard GPU Sync"! I don't know exactly what that option does, but it seems to solve the stuttering.

So, to any G-Sync user trying to get it to work:

- Disable V-Sync in the Video settings
- Enable Hard GPU Sync in the Video settings
- Enable "Limit Maximum Run Speed" in the General settings
- Disable "Audio Sync" in the Audio settings

Make sure you're running in true exclusive fullscreen mode or G-Sync will not function correctly. Running in windowed fullscreen will not work with G-Sync.
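For anyone editing the config file directly, the above maps roughly onto these retroarch.cfg entries (a sketch; key names may differ between versions, and "Limit Maximum Run Speed" in particular has been renamed over time, so set that one from the menu if in doubt):

```
video_vsync = "false"               # V-Sync off; G-Sync handles tearing
video_hard_sync = "true"            # Hard GPU Sync
video_fullscreen = "true"           # true exclusive fullscreen (required for G-Sync)
video_windowed_fullscreen = "false"
audio_sync = "false"                # don't block on audio
# enable "Limit Maximum Run Speed" from the General settings menu
```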

Thanks for the help, everyone!

aliaspider commented 9 years ago

Cool! Did you also test cores running at 50fps? That is the most interesting use case for G-Sync, and you would also be able to see clearly whether there is a difference from normal V-Sync, assuming of course that G-Sync works as advertised.

nfp0 commented 9 years ago

I haven't tested that yet, but I'm also curious about it! Care to recommend a core and ROM for me to test?

aliaspider commented 9 years ago

Any core that supports PAL would work. Or try nx-engine; it has an in-game option to switch between 50fps and 60fps.

nfp0 commented 9 years ago

Okay, I have partial success on the 50Hz front. With V-Sync we get the usual horrible micro-stuttering associated with playing a 50Hz game on a 60Hz screen (or 120Hz in this case). But as soon as I use the G-Sync method I posted above, the micro-stuttering is gone! G-Sync automatically sets the monitor to a 50Hz refresh and the motion is mostly fluid, like on old PAL TVs.

I say mostly because another, more random, kind of stuttering was introduced, more akin to the inconstant stuttering I was experiencing, even on 60Hz games, before I enabled "Hard GPU Sync". It's bearable IMO, but it's strange that I can't achieve the perfect 100% smoothness I managed to get on the 60Hz games. Regardless, the improvement from V-Sync to G-Sync on 50Hz games is night and day; the micro-stuttering is completely eliminated. My CPU is an i5 3570K overclocked to 4.5GHz, so I don't think that's the problem, even though I used the bsnes core. But maybe someone with different hardware should give it a try.

Another peculiar thing I found: when running a PAL game with the G-Sync settings I listed in my other post, if I use the "save state" option from the menu, the game accelerates to 60Hz instead of 50Hz and the sound is gone. The emulator has to be restarted to get the normal behavior back. Weird...

ToadKing commented 9 years ago

G-Sync should work under the correct conditions.

nfp0 commented 9 years ago

@ToadKing I think you're wrong about G-Sync. V-Sync should be turned off. It's not possible to have screen tearing with G-Sync because only full frames are drawn. That's its sole purpose: to eliminate screen tearing without the need to synchronize an application to the screen's refresh rate.

You can easily test this with any fullscreen application.

ToadKing commented 9 years ago

Actually I was reading some things wrong. V-Sync is just stubbed out when G-Sync is enabled so it shouldn't have any effect being on or off. Sorry about that.

nfp0 commented 9 years ago

Yes, usually there is no consequence of having V-Sync on with G-Sync, but in some cases, depending on the application, you might be introducing V-Sync-related input lag without any advantage. Therefore it is recommended to always disable V-Sync in all applications when using G-Sync, just to make sure.

Out of curiosity, did you need to enable "Hard GPU Sync" to make it smooth?

nfp0 commented 9 years ago

Sorry for the double post, but can anyone tell me why I need "Hard GPU Sync" for G-Sync to work correctly and eliminate the random stutter?

What does "Hard GPU Sync" do exactly, and why does leaving it disabled cause so much random stutter in the G-Sync setup described in my post above?

Awakened0 commented 9 years ago

That option is meant to reduce video latency (input lag) caused by video driver buffering. Maister describes the feature in the input latency section here: https://github.com/libretro/RetroArch/wiki/Getting-optimal-vsync-performance

So it's something you'd want on anyways as long as your CPU can handle it. I'm not sure why having it on would help with G-Sync. Seems like a nice side effect :)
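For the curious, "Hard GPU Sync" in the GL driver is essentially a fence-based wait: after each frame is submitted, the frontend inserts a GPU fence and blocks the CPU until the GPU has caught up to within a configurable number of frames. A minimal sketch of that pattern (not RetroArch's actual code; it assumes an OpenGL 3.2+ context with a loader providing the GLsync functions):

```c
#include <string.h>
#include <GL/gl.h>   /* assumes a loader/extension header exposing GLsync APIs */

#define MAX_FENCES 8  /* must exceed the largest allowed lag in frames */

static GLsync   fences[MAX_FENCES];
static unsigned fence_count;

/* Call right after swapping buffers. max_lag_frames = 0 means "wait until the
 * GPU has fully finished this frame before emulating the next one". */
void hard_gpu_sync_after_swap(unsigned max_lag_frames)
{
   fences[fence_count++] = glFenceSync(GL_SYNC_GPU_COMMANDS_COMPLETE, 0);

   while (fence_count > max_lag_frames)
   {
      /* Block (flushing first) until the oldest pending frame completes. */
      glClientWaitSync(fences[0], GL_SYNC_FLUSH_COMMANDS_BIT, 1000000000);
      glDeleteSync(fences[0]);

      fence_count--;
      memmove(fences, fences + 1, fence_count * sizeof(GLsync));
   }
}
```

The likely reason this also helps G-Sync: with the CPU prevented from buffering several frames ahead of the GPU, the time at which each frame is actually flipped (and therefore displayed, on a VRR monitor) stays much more regular.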

nfp0 commented 9 years ago

Yeah, I've read that. I understand it reduces input lag, but it makes no sense that having it disabled causes big random stuttering in G-Sync.

Also, I'm worried about the only partial success of G-Sync with 50Hz PAL games. The random stuttering occurs even with "Hard GPU Sync" on. I would like to call the developers' attention to this issue, considering G-Sync and FreeSync seem to be the holy grail for emulation with perfect sync, independently of frequency.

inactive123 commented 9 years ago

Did you really turn off rate control? Rate Control Delta needs to be set to 0.0, otherwise it's not disabled and is still active.

nfp0 commented 9 years ago

I assume you're talking about this option? (screenshot of the RetroArch audio options; I'm using dedicated fullscreen, I just switched to windowed to take the screenshot)

Indeed I didn't have it at 0.000. I thought disabling "Audio Sync Enable" would be enough. But now I've also set "Audio Rate Control Delta" to 0.000 and the same issue persists: 50Hz has a stutter every 2 seconds or so, but it's not constant. It's random.

Another question: is the "Refresh Rate" setting in the video settings relevant in my situation, where I have V-Sync and audio sync off?

inactive123 commented 9 years ago

Yes, that's the setting I mean.

In any case, I don't have a G-Sync monitor; only ToadKing does. RetroArch certainly wasn't made with it in mind back when the basic A/V syncing model was written.

I don't know if @Themaister can still respond to technical questions like this, but he'd be your best bet.

nfp0 commented 9 years ago

@twinaphex Of course, that is understandable. Variable refresh rate monitors weren't even on the horizon then. But in theory, G-Sync does not need support at the application level; it's application agnostic. Any dedicated fullscreen application takes advantage of the variable refresh rate.

What I find weird is that it works perfectly at 60Hz and stutters a lot at 50Hz in RetroArch (and not the expected 50Hz microstutter, mind you). Also weird is that I needed to activate "Hard GPU Sync" for it to be smooth.

I'm not even planning on playing any 50Hz games, but I love this project and I want to report and test what I can, since there seem to be few people using RetroArch with variable refresh rate monitors yet, and they are of special relevance for the emulation scene.

inactive123 commented 9 years ago

This is still Windows-only, right?

You tried messing around with some of the settings in the nvidia driver control panel?

I'll try asking around and see if some engineers can tell me anything substantive about any issues they might be experiencing here. I am reluctant to add a whole bunch of hacks to our basic syncing model just for G-Sync though.

nfp0 commented 9 years ago

Of course, I agree. This ought to be implemented well, not as a side hack. Keep in mind that even though variable refresh rate is still in its infancy, it is already a VESA standard, and more and more monitors are going to support it, with both NVIDIA and AMD already embracing it one way or another. And as I said, this is of special importance for emulators, where we have to make compromises, like programming special A/V synchronization algorithms in the emulator to cope with our crippled fixed 60Hz or 120Hz monitors.

And no, this is already available on Linux too, fortunately. : ) http://www.blurbusters.com/gsync-support-arrives-in-new-linux-drivers/

Googer commented 9 years ago

Personally, I think properly working variable refresh support is of even greater interest for arcade game emulation, for perfectly smooth and accurate low-lag synchronization, given how many machines have funky refresh rates (Mortal Kombat uses something in the vicinity of 53Hz, as I recall). Hopefully the hardware supporting this becomes mainstream sooner rather than later, so we don't need to shell out stupid money for specialized gaming monitors just to get support for it. :p

ToadKing commented 9 years ago

Everything works currently on my setup. I have all default settings other than exclusive fullscreen and audio rate control delta set to 0. Not sure why BlueNinja0 is still having an issue.

Does your monitor say it's in G-Sync mode?

nfp0 commented 9 years ago

@ToadKing I only have an issue with 50Hz games. 60Hz games are perfectly smooth, even with the audio rate control delta at 0.005. Yes, my monitor says it's in G-Sync mode.

Are you sure you are using all the default settings? That implies you are also using "Audio Sync" and not using "Limit Maximum Run Speed" in the general settings. Could you please share your settings file so that I can investigate further? I'm on the latest 1.1 nightly, by the way (build date April 28 2015), if that matters.

ToadKing commented 9 years ago

These are my settings: https://gist.github.com/ToadKing/5fb7d7288646082545fa

You'll want to use audio sync with G-sync instead of limit maximum run speed.

nfp0 commented 9 years ago

@ToadKing thanks for your help.

I was previously using "limit maximum run speed", incorrectly thinking it was an accurate way to limit the fps, but it isn't, and indeed there was a small amount of judder. Thanks for the heads up; I've since read in the RGUI that it is not accurate. But "audio sync" gives me even more stutter. If I disable it, the picture is absolutely smooth, but of course it then runs at 120fps.

I've tried the exact same configuration as you. I checked my config file against yours and I can't for the life of me get it smooth. I even used the same SNES core as you. I'm using this ROM to test the smoothness with the "Grid scroll test"; could you give it a try? http://sourceforge.net/projects/testsuite240p/files/SNES_SFC/ I've also tried with normal games and, albeit less noticeable, the result is the same.

I'm out of ideas. My computer is certainly able to run it, as it runs perfectly smoothly at 120fps. But here are my specs anyway: i5 3570K @ 4.5GHz, 8GB RAM, GTX 680 SLI (yes, I tried with SLI disabled).

I tried the MESS emulator and managed to get perfect smoothness in both 60Hz and 50Hz games, meaning G-Sync was working correctly. But I like RetroArch more, and I wanted to use the bsnes core. I'm really out of ideas.

@ToadKing What do you have set as the maximum refresh rate allowed for your G-Sync in the NVidia drivers?

bidinou commented 7 years ago

Hi everyone! Is it supposed to work nowadays on Linux? I have a FreeSync monitor and I'm using Lakka (KMS). I still have issues getting perfectly smooth scrolling in some games, even after tweaking tons of settings (recent i5 NUC with Iris 450).

hizzlekizzle commented 7 years ago

It should work. If you want to open a thread on the forum about it, I'll help you troubleshoot there.

bidinou commented 7 years ago

Thanks a lot! I created a new issue: https://github.com/libretro/RetroArch/issues/3871 The main mystery seems to be that my monitor in FreeSync mode sets itself to around 70-75Hz for 60Hz games, while RetroArch detects a rate around 55-60Hz!

bit-bLt commented 6 years ago

For anyone who may still be unable to get the smooth scrolling they desire, here's one more setting combo to try:

Unfortunately I'm still getting very minor audio clicks every so often, but it's definitely an improvement in the motion department.

nfp0 commented 6 years ago

@SirTuttles Did you try that with non-60Hz games? Like 50Hz PAL games or arcade 75Hz games?

blurbusters commented 6 years ago

I am trying to help a user fix microstutter during variable refresh rate, and I have given them recommendations: https://forums.blurbusters.com/viewtopic.php?f=5&t=3903&p=31050#p31050

However, it is also the emulator author's responsibility to improve framepacing as much as possible, to minimize VRR microstutter.

This is what I have advised the user, but I am cross-posting here because too many emulators fail to implement VRR support properly. Many are already good (perhaps yours is), but I'm posting here as a public education notice.

Decision matrix:

- First, try the recommendations in issue #1633 on RetroArch's bug tracker. If this succeeds, then you're finished; the emulator frame pacing issue is fixed.
- Failing that, try removing your framerate throttle and use RTSS to throttle your emulator. If this succeeds, then the emulator's frame pacing is confirmed defective, and the emulator authors should read the section below.

[FOR SOFTWARE DEVELOPERS] Skip this section if you don't develop software.

NOTE: This is for other creators who visit the Blur Busters Forums, including emulator programmers / app developers / authors: please pay attention to your emulator's frame pacing. You want sub-millisecond timing accuracy when executing pageflips such as Direct3D Present() calls, to avoid microstutters on VRR displays. A 1ms error in frame pacing means a 1-screenwidth-per-second space shooter or Super Mario scroller ends up with a 1/1000-screenwidth microstutter (equivalent to 2 pixel widths on a 1920x1080 display). This type of microstutter from a 1ms frame pacing error is noticeable when the errors happen at regular intervals.

For proper fluidity on VRR displays, refresh cycles are controlled by the exact timing of the pageflip. So when you're in VSYNC OFF mode or VRR mode, please use microsecond clocks for timing your framebuffer flips in your emulator source code. This reduces microstutter for software-triggered display refresh cycles (which is essentially what G-SYNC and FreeSync are), especially for strongly fixed-Hz content such as videos and emulators. Basically, when programming for a VRR display you are doing the equivalent of a software-based VSYNC ON that has lower lag than waiting for a monitor's next fixed refresh cycle (on a non-VRR display). To display fixed-Hz material (emulators, videos, etc.) in a way that is as microstutter-free as true VSYNC ON, but without the lag of VSYNC ON, you need really pristine frame pacing accuracy.

Do not use uncompensated millisecond-accurate timers. You may have to intentionally do a micro-busywait loop (a few tens or hundreds of microseconds) within your timer event to re-align your next framebuffer pageflip to a more exact interval since the last one: record a microsecond timestamp every time you do a framebuffer flip, and align the next flip as exactly as your software allows. Timer events add microstutter for fixed-Hz material on VRR displays, so pad the beginning of your timer event routine with a micro-busywait since the last event, if you're using a timer event. RTSS already does this sort of pageflip-accuracy magic; it's not rocket science. You might be using another method, but if you're using a timer event as your frame pacing method, then trigger your timer event approximately 1ms early and do a micro-busywait to microsecond accuracy. Timer jitter gone. No more microstutter. Problem solved.

PC games don't have to worry about this as much, because their framerates are designed to fluctuate, but emulators and videos are not, so you have to raise your frame pacing game to the microsecond level in the app/software that you create. [/FOR SOFTWARE DEVELOPERS]

(Direct quote from Blur Busters Forums)

If microsecond-accurate framerate throttling does not yet exist in RetroArch, it is an easy fix. Please open a new issue here for it, for the good of other VRR users. This is a pervasive problem with approximately 50% of emulators, and it's amazing how big a difference a single microsecond actually makes -- but it makes sense, because VRR (FreeSync, G-SYNC) hands the refresh cycle trigger over to a software clock instead of a hardware refresh rate.

If you are doing render-and-flip all at once, then frame rendertime fluctuations will cause microstutter when mapping an emulated fixed Hz onto a VRR display. Separate your rendering from your frame flipping instead of rendering-and-flipping all at once: render, then micro-busywait, then flip the framebuffer (microsecond-aligned to the previous flip). You may want a configurable margin (e.g. 1ms, 2ms), depending on the emulator, as padding for rendertime variances.
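A minimal sketch of that pacing pattern (render, coarse sleep, micro-busywait, flip), using POSIX clocks; the render/swap calls are hypothetical stand-ins, not RetroArch's API:

```c
#include <stdint.h>
#include <time.h>

static uint64_t now_us(void)
{
   struct timespec ts;
   clock_gettime(CLOCK_MONOTONIC, &ts);
   return (uint64_t)ts.tv_sec * 1000000ull + (uint64_t)ts.tv_nsec / 1000;
}

extern void render_frame(void);   /* stand-in: emulate + render, no flip     */
extern void swap_buffers(void);   /* stand-in: Present() / eglSwapBuffers... */

/* frame_us: emulated frame period in microseconds, e.g. ~16639 for 60.0988 Hz. */
void run_frame_paced(uint64_t frame_us)
{
   static uint64_t next_flip_us;
   if (next_flip_us == 0)
      next_flip_us = now_us() + frame_us;

   render_frame();                          /* render first, flip later */

   /* Coarse sleep until ~1 ms before the target to avoid burning CPU... */
   int64_t remain = (int64_t)(next_flip_us - now_us());
   if (remain > 1500)
   {
      struct timespec ts = { 0, (long)(remain - 1000) * 1000L };
      nanosleep(&ts, NULL);
   }

   /* ...then busy-wait the last stretch to microsecond accuracy. */
   while ((int64_t)(next_flip_us - now_us()) > 0)
      ;

   swap_buffers();              /* on a VRR display, this flip IS the refresh */
   next_flip_us += frame_us;    /* align to the previous target, not to "now" */
}
```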

Many developers are unaware that the exact time of the framebuffer flip immediately triggers a refresh cycle (photons hitting eyeballs) on a variable refresh rate display. Your computer has (essentially) just commanded the VRR monitor "display the frame NOW" when your app executes glutSwapBuffers() or Present() or whatever. Imperfect software timers equal microstutter! So... wisely wield your new-found power of directly commanding refresh cycles :)

If RetroArch already has microsecond-accurate frame throttling, that is truly appreciated. If RetroArch does not have (best-effort) microsecond-accurate frame throttling, fix it, pretty-pretty-please.
G-SYNC and FreeSync users thank you very much for reading this public service notice --

Cheers, Mark Rejhon (Founder, Blur Busters & TestUFO)

P.S. TIP: VRR is good for video too: 48fps film, 59.94fps, 60fps, 120fps, 18fps, 23.97fps, 24fps, 25fps, whatever. Please verify that the video players in RetroArch are fully VRR compatible. Many players configured into fullscreen mode automatically work with VRR (e.g. SMPlayer), but many video player authors unaware of VRR accidentally (with good intentions) mess things up while trying to improve things for a fixed-Hz monitor; the authors are unaware of how VRR works. VRR-friendly video is easier than most think: it's simply executing frame pageflips at intervals that match each frame's timecode as exactly as possible, and letting VRR do the magic for you.

P.S. TIP 2: A 144Hz or 240Hz VRR monitor is an excellent way of reducing 60Hz emulator lag. Occasionally, in certain cases, the lag savings are so large that it can actually have less input lag than the original emulated machine connected to a slow-scanning 60Hz display. This is possible because individual refresh cycles are scanned out in 1/144sec (for 144Hz VRR) or 1/240sec (for 240Hz VRR), so your 60Hz frame appears sooner than on many 60Hz displays, even though you're only running at 60fps on a high-Hz VRR display. Basically, a perfect ultrasmooth 60Hz VSYNC ON appearance with most of the low-lag benefits of 144Hz/240Hz VSYNC OFF. This VRR trick for emulators (with good frame pacing) can compensate for a large amount of emulator lag, and is part of why VRR is getting popular with emulator users as a lag-reducing technique.

nfp0 commented 6 years ago

@mdrejhon Excellent analysis! I was never aware that micro-stutter could happen this way. This might explain why I seemed to have configured RetroArch successfully for VRR but still felt something was off; there was always this feeling of unevenness in the motion.

blurbusters commented 6 years ago

UPDATE: Someone has posted a potential (partial or full) solution on the Blur Busters Forums:

https://forums.blurbusters.com/viewtopic.php?f=5&t=3903&p=31135#p31128

Unconfirmed fix.

Reading elsewhere, it seems this might fix many modules but may not fix the MAME module (due to the possible frame pacing bug above), according to: https://forums.libretro.com/t/gsync-please-help-solve-this-mistery/12501

blurbusters commented 6 years ago

Slightly off-topic, but related to microsecond frame pacing precision capabilities:

Another suggestion, for non-G-SYNC setups, is a new algorithm I have come up with for emulator developers: the Blur Busters Lagless Raster Follower Algorithm for Emulator Developers.

Basically, I've discovered a way to successfully synchronize the realworld raster to the virtual raster scanline within a ~0.5ms delay (a few scanlines' worth of jitter), while looking like perfect VSYNC ON. It's a tearingless VSYNC OFF that does precision raster-timed flips of the same VSYNC OFF framebuffer repeatedly -- a 1000-to-2000fps VSYNC OFF full-framebuffer virtualization of a rolling-window multi-scanline buffer -- all using 100% standard Direct3D APIs. I don't know whether anyone else has tried this before (and failed), but it works (at ~2000fps on current GPUs = 0.5ms raster-follower jitter margin)...

The 0.5ms window allows PC performance to fluctuate (you can't always be perfectly microsecond-exact), so you don't need perfect line-for-line raster sync, just a ~0.5ms realworld-raster chase-behind of the virtual raster on the display scanout.

Same lag as a real console. Same lag as a FPGA "emulator". No GPU framebuffer delays!

I programmed raster interrupts on a Commodore 64 in 6502 machine language in Supermon 64, so I understand rasters well enough to know the timing precision needed -- things like multiplying 8 sprites to 16 and 32, and split-screen scrolling zones. So I researched whether raster synchronization between the real raster and a virtual raster would be possible with enough of a forgiving margin for PC performance jitter. The answer is yes, with a trick.

Virtualized-realworld raster synchronization is now possible on 60Hz displays. With a clever, performance-jitter-forgiving, multi-scanline rolling-window trick (~0.25ms, ~0.5ms, ~1ms; it can be a configurable constant) you don't need to be scanline-exact. That solves the problem: you can now do realtime raster-follower algorithms for virtual-vs-real rasters in emulators to reduce input lag!
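To make the rolling-window idea concrete, here is a very rough sketch of a frameslice beam-racing loop. All of the helpers are hypothetical stand-ins, and vblank alignment and error recovery are omitted; it only illustrates the chase-margin timing described above.

```c
/* Very rough sketch of frameslice beam racing (not RetroArch code). The frame
 * is emulated in slices; each slice is presented with VSYNC OFF as soon as it
 * is ready, timed so the realworld raster trails the emulated raster by a
 * small chase margin. */
extern int  current_scanline(void);              /* realworld scanout line (read or estimated) */
extern void emulate_scanlines(int from, int to); /* render this slice into the framebuffer     */
extern void present_vsync_off(void);             /* tearing flip of the same framebuffer       */

void run_frame_beam_raced(int total_lines, int slice_lines, int margin_lines)
{
   for (int start = 0; start < total_lines; start += slice_lines)
   {
      int end = (start + slice_lines < total_lines) ? start + slice_lines
                                                    : total_lines;

      /* Throttle: only emulate this slice once the realworld raster is within
       * `margin_lines` of it, so input is sampled as late (fresh) as possible
       * and the virtual raster stays just ahead of the scanout. */
      while (current_scanline() < start - margin_lines)
         ;  /* micro-busywait */

      emulate_scanlines(start, end);

      /* VSYNC OFF flip: the tearline lands near the current raster position,
       * above the freshly rendered slice, so it is invisible (the pixels above
       * it are identical to what was already presented). */
      present_vsync_off();
   }
}
```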

I don't know if RetroArch is architected in a way that is friendly to this level of accuracy, but this is a new algorithm for emulator developers to try; it will be useful to melee players and other lag-critical retro gamers.

Obviously this is much harder than simply fixing this issue (i.e. fixing the G-SYNC frame pacing automatically so manual tweaking isn't needed), but I thought I would mention this new algorithmic development.

EDIT: Beam chasing demo YouTube Video of me dragging a VSYNC OFF tearline up/down with my computer mouse: https://www.youtube.com/watch?v=OZ7Loh830Ec

OverHaze commented 6 years ago

The only way I have found to get smooth G-Sync is to use the unconfirmed fix above, frame-limit with RivaTuner, turn off V-Sync at the driver level, and then set frame delay to at least 10.

That last bit is obviously an issue. Basically you can say goodbye to using Higan with this method!

hizzlekizzle commented 6 years ago

@mdrejhon I don't think your method will work with RetroArch/libretro, since--AFAIK--the lowest granularity we can go is one frame. That is, we can't really run a scanline or few and then vblank.

OverHaze commented 6 years ago

@hizzlekizzle Is a solution to the Gsync issue something that is currently being worked on?

hizzlekizzle commented 6 years ago

@OverHaze As far as we can tell, it seems to be fine with all cores except for the MAME-git core, which has microstutters(?). I think other cores should largely be fine, from what I've heard. @bparker06 has a gsync monitor now but he hasn't gotten any actionable reports to test/fix.

blurbusters commented 6 years ago

> @mdrejhon I don't think your method will work with RetroArch/libretro, since--AFAIK--the lowest granularity we can go is one frame. That is, we can't really run a scanline or few and then vblank.

Sounds like it would be hard to do without rearchitecting for line-based hooks at this time. I'm talking to other emu devs who have expressed interest; there's now a working experimental beam chasing mod for GroovyMAME, and Toni is working on a beam chaser mode for WinUAE. So this concept is coming to multiple emulators already!

I am currently investigating starting a cross-platform raster-polling library (inquire within if interested). Oculus already has something similar for lower-latency Android VR rendering, since an approximate raster position can be calculated from a VSYNC heartbeat. So there is a portable beam chasing fallback for platforms that don't give you a raster register -- which apparently works even on phones (which also have scanned displays).

Maybe hold off until we see how complex it is in other emulators, and wait until a mature open-source cross-platform raster-poller becomes available for beam chasing applications.

Also, one doesn't need proprietary APIs for beam chasing, since the minimum requirements are (A) access to a tearing API such as VSYNC OFF or front-buffer rendering (one or the other), (B) access to a VSYNC heartbeat to calculate an estimated raster position, and (C) access to a microsecond clock. If A/B/C are met, it's possible to beam-chase successfully, all distilled down to industry-standard APIs (D3D, OGL, etc.). I've found software-calculated rasters can be within less than 5% of a hardware raster poll, and in some cases within 0.1% (with optional knowledge of the VBI as a percentage of the refresh cycle). Proprietary knowledge of the platform is only frosting that improves the accuracy of beam chasing; it is not essential. Knowing the screen orientation is a bonus (e.g. for phones and tablets), so you know the scan direction of your display, but otherwise I've come up with a way to distill beam chasing into a portable cross-platform method, since all it really amounts to is intelligent steering of the exact position of VSYNC OFF tearlines from nothing more than knowledge of VSYNC timestamps.
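A minimal sketch of the B+C part (estimating the raster position from nothing but VSYNC timestamps and a microsecond clock); the helper names and the VBI fraction are illustrative assumptions:

```c
#include <stdint.h>

extern uint64_t now_us(void);          /* (C) monotonic microsecond clock       */
extern uint64_t last_vsync_us(void);   /* (B) timestamp of the last VSYNC pulse */

/* Scanout advances at a nearly constant rate between VSYNC heartbeats, so the
 * time elapsed since the last VSYNC maps linearly onto a scanline number. */
int estimated_scanline(int visible_lines,   /* e.g. 1080                          */
                       double refresh_hz,   /* e.g. 60.0                          */
                       double vbi_fraction) /* blanking share of the cycle, ~0.05 */
{
   double period_us  = 1000000.0 / refresh_hz;
   double elapsed_us = (double)(now_us() - last_vsync_us());

   /* Include the blanking interval in the line count, then clamp. */
   double total_lines = (double)visible_lines / (1.0 - vbi_fraction);
   int    line        = (int)((elapsed_us / period_us) * total_lines);

   if (line >= visible_lines)
      line = visible_lines;    /* currently in the blanking interval */
   return line;
}
```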

That said, I nonetheless think the root cause of the G-SYNC microstutters should be hunted down and fixed in the MAME module of RetroArch, so it doesn't need external "help" (end-user tweaking, etc.). That step might also improve the stutter mechanics of RetroArch when it's used in conjunction with the low-lag VSYNC ON tricks that some people use.

EDIT1: I've made a YouTube video of my real-time beam chasing in cross-platform programming: https://www.youtube.com/watch?v=OZ7Loh830Ec I will release the code soon; it'll probably be useful to at least a few emu devs to figure out how to mathematically simulate a realworld raster register accurately without access to a hardware raster register -- I'm successfully doing this within just a few scanlines of error margin using generic programming (and my know-how of video signals, tearline behaviour, etc.).

EDIT2: Update: WinUAE now has a really good version of my beam racing algorithm: http://eab.abime.net/showthread.php?t=88777&page=9

Three emulator developers have joined the Blur Busters Forums discussion (see pages 4, 5, 6, and 7 for interesting dev chat): https://forums.blurbusters.com/viewtopic.php?f=10&t=3972

Awakened0 commented 6 years ago

@hizzlekizzle I think those MAME micro stutters were fixed when this issue was closed: https://github.com/libretro/mame/issues/28

blurbusters commented 6 years ago

No, I don't think so... only for VSYNC ON

There is a goldilocks-bug situation in some software where frametime variances fall below the VSYNC ON microstuttering threshold but are not consistent enough to smooth out VRR microstuttering when outputting fixed-gametime-interval material like emulators to a variable-Hz display (one needs to synchronize either the variability or the consistency).

Normally, variable-refresh-rate displays prosper best when variable gametimes are output to a variable-Hz display, for "synchronized variability" -- see diagram. I originally made this diagram in 2013 to explain why G-SYNC makes most framerate fluctuations invisible to the human eye: object positions stay in sync with eye-tracking positions despite an erratic refresh interval, so continually random frametimes look smooth. But emulators run content that always has a fixed interval, so you need to frame-pace it out to the display at intervals as exact as possible.

I still get reports of microstutters when MAME is used with G-SYNC, so maybe something else is happening. Maybe what happened is that frametime variances fell into the sub-refresh-cycle league necessary to stop VSYNC ON microstuttering, but they aren't being made more exact than that (which is what it takes to make G-SYNC/FreeSync look like VSYNC ON). It seems some tweaks fix this, but those tweaks are somehow MAME-specific, so there appears to be some kind of frame pacing breakage along the chain that requires an external tweak to fix -- a tweak that shouldn't be necessary for end users (in an ideal scenario).

If the variances are less than half a refresh cycle, then VSYNC ON hides this perfectly: VSYNC ON re-times the variable frame delivery into perfectly consistent frame delivery. But VRR monitors will merrily display inconsistently-delivered frames, so if an emulator is jittering out frame Present()s to the display, the stutters will show through on VRR.

I don't know what the root cause of the bug is (hell, I'm even open to the possibility that it's not RetroArch itself -- it's just a weird pattern that things are smooth for other emulator modules but not for MAME), but it seems worth eventually finding the root cause and fixing it. It's probably a "simple fix but hard to find" situation for any developer who doesn't have a VRR test display.