libretro / dosbox-svn

GNU General Public License v2.0

Screen tearing #7

Closed andres-asm closed 4 years ago

andres-asm commented 4 years ago

I'm looking for some help to solve the horrible screen tearing we have in this core (I get a horizontal line running across the screen during vertical scrolling in games like Jazz Jackrabbit).

I didn't write this core; I just cleaned it up and rebased it on top of DOSBox-SVN so it's easier to maintain. I've added tons of features but hardly touched the rendering code.

[screenshot of the tearing artifact]

Anyway, the whole rendering code is quite simplistic, it's basically this:

#include <string.h>
#include "libretro.h"
#include "dosbox.h"
#include "video.h"

Bit8u RDOSGFXbuffer[1024*768*4];
Bitu RDOSGFXwidth, RDOSGFXheight, RDOSGFXpitch;
unsigned RDOSGFXcolorMode = RETRO_PIXEL_FORMAT_0RGB1555;
void* RDOSGFXhaveFrame;

Bitu GFX_GetBestMode(Bitu flags)
{
    return GFX_CAN_32 | GFX_RGBONLY;
}

Bitu GFX_GetRGB(Bit8u red,Bit8u green,Bit8u blue)
{
    return (red << 16) | (green << 8) | (blue << 0);
}

Bitu GFX_SetSize(Bitu width,Bitu height,Bitu flags,double scalex,double scaley,GFX_CallBack_t cb)
{
    memset(RDOSGFXbuffer, 0, sizeof(RDOSGFXbuffer));

    RDOSGFXwidth = width;
    RDOSGFXheight = height;
    RDOSGFXpitch = width * 4;

    if(RDOSGFXwidth > 1024 || RDOSGFXheight > 768)
        return 0;

    return GFX_GetBestMode(0);
}

bool GFX_StartUpdate(Bit8u * & pixels,Bitu & pitch)
{
    pixels = RDOSGFXbuffer;
    pitch = RDOSGFXpitch;

    return true;
}

void GFX_EndUpdate( const Bit16u *changedLines )
{
    RDOSGFXhaveFrame = RDOSGFXbuffer;
}

// Stubs
void GFX_SetTitle(Bit32s cycles,Bits frameskip,bool paused){}
void GFX_ShowMsg(char const* format,...){}
void GFX_Events(){}
void GFX_SetPalette(Bitu start,Bitu count,GFX_PalEntry * entries){}

We're basically getting a pointer to the framebuffer and blitting that, as-is. I see that whoever wrote this does nothing with changedLines, so that's one possibility for the problem.

The libretro core leverages a co-threading library (libco): DOSBox runs in a co-thread, and we escape to the libretro thread on every retro_run interval to grab a finished frame. Maybe we're switching before the actual frame is complete, so that's another possibility.

andres-asm commented 4 years ago

OK, after messing around, the problem seems to be the scheduler: we switch contexts at 1/fps. With this change we get no tearing, but bad frame pacing:

$ git diff
diff --git a/libretro/libretro.cpp b/libretro/libretro.cpp
index 4b2a4fa2..91d39ac2 100644
--- a/libretro/libretro.cpp
+++ b/libretro/libretro.cpp
@@ -1391,10 +1391,6 @@ void retro_run (void)
         /* Run emulator */
         co_switch(emuThread);

-        /* Upload video */
-        video_cb(RDOSGFXhaveFrame, RDOSGFXwidth, RDOSGFXheight, RDOSGFXpitch);
-        RDOSGFXhaveFrame = 0;
-
         /* Upload audio */
         audio_batch_cb((int16_t*)audioData, samplesPerFrame);
     }
diff --git a/libretro/libretro_gfx.cpp b/libretro/libretro_gfx.cpp
index 1baf21d9..204c86e9 100644
--- a/libretro/libretro_gfx.cpp
+++ b/libretro/libretro_gfx.cpp
@@ -7,6 +7,10 @@ Bit8u RDOSGFXbuffer[1024*768*4];
 Bitu RDOSGFXwidth, RDOSGFXheight, RDOSGFXpitch;
 unsigned RDOSGFXcolorMode = RETRO_PIXEL_FORMAT_0RGB1555;
 void* RDOSGFXhaveFrame;
+float RDOSfps;
+
+
+extern retro_video_refresh_t video_cb;

 Bitu GFX_GetBestMode(Bitu flags)
 {
@@ -25,6 +29,7 @@ Bitu GFX_SetSize(Bitu width,Bitu height,Bitu flags,double scalex,double scaley,G
     RDOSGFXwidth = width;
     RDOSGFXheight = height;
     RDOSGFXpitch = width * 4;
+    RDOSfps = 60.0f;

     if(RDOSGFXwidth > 1024 || RDOSGFXheight > 768)
         return 0;
@@ -43,6 +48,8 @@ bool GFX_StartUpdate(Bit8u * & pixels,Bitu & pitch)
 void GFX_EndUpdate( const Bit16u *changedLines )
 {
     RDOSGFXhaveFrame = RDOSGFXbuffer;
+    video_cb(RDOSGFXhaveFrame, RDOSGFXwidth, RDOSGFXheight, RDOSGFXpitch);
+    RDOSGFXhaveFrame = 0;
 }

 // Stubs

I figure the rest of the logic may need changes to work with this new behavior.

realnc commented 4 years ago

I can explain the stutter. It's only there when using g-sync. This is how g-sync behaves when buffer flips stop happening ("FPS stall"). If the application stops presenting frames and you get into a 0 FPS situation for over a second or so, g-sync seems to stall completely. After the application starts presenting frames again, g-sync has a recovery period of about one second where it stutters severely as it tries to figure out the frame pacing again. Enabling the FPS indicator in RA shows that the core stops presenting frames when nothing changes on the screen. Normally, the FPS indicator stays relatively fixed at 70 FPS in most situations, or around 60 FPS in games that switch to a 60 Hz mode. With this patch, however, the indicator shows very low FPS values while the image is static and nothing is changing.

G-sync exhibits this stutter behavior with all games in general. However, the vast majority of games do not stall like that during gameplay; they only sometimes stall on static loading screens and such, which makes the effect unnoticeable, unless the game skips the small grace period after a stall that would warm up the rendering pipeline. (Exiting the map or inventory screen in The Witcher 3, for example, shows this severe g-sync stutter due to recovering from 0 FPS.) The problem is worse with g-sync + v-sync, and less noticeable with g-sync + v-sync OFF (plus an FPS cap to keep the game's FPS below the display's maximum refresh rate).

It seems DOSBox in general only presents new frames if something actually changed on the screen. When looking at a static screen, DOSBox will basically output 0 FPS because nothing in the image changes. However, in VRR mode the application controls the refresh rate, so it needs to keep presenting frames even if they're identical. If the application stops presenting frames, you effectively get 0 Hz, and g-sync stalls and has to recover when frame presentation resumes.

Without this patch, the stutter is not there because, as you mentioned, "we switch contexts at 1/fps", so g-sync does not stall. However, the context switch is out of sync with DOSBox, and thus there's tearing.

When disabling g-sync (running in windowed mode in Linux will prevent g-sync), there is no stutter because the refresh rate is fixed regardless of whether the core presents frames or not. Of course you then get the usual stutters due to the mismatch between real refresh rate and emulated refresh rate.

I'm not sure how easy it is to fix this, but I hope my explanation above is of help.

andres-asm commented 4 years ago

It stutters a lot and flat out locks up in certain games with this, even without g-sync.

realnc commented 4 years ago

Yeah, you're right.

I think I fixed it by using two frame buffers. I reverted your patch and did the following instead:

diff --git a/libretro/libretro.cpp b/libretro/libretro.cpp
index 4b2a4fa2..1f093429 100644
--- a/libretro/libretro.cpp
+++ b/libretro/libretro.cpp
@@ -143,7 +143,7 @@ static bool is_restarting = false;
 extern Bit8u RDOSGFXbuffer[1024*768*4];
 extern Bitu RDOSGFXwidth, RDOSGFXheight, RDOSGFXpitch;
 extern unsigned RDOSGFXcolorMode;
-extern void* RDOSGFXhaveFrame;
+extern Bit8u RDOSGFXhaveFrame[sizeof(RDOSGFXbuffer)];
 unsigned currentWidth, currentHeight;
 float currentFPS = 60.0f;

@@ -1393,7 +1393,6 @@ void retro_run (void)

         /* Upload video */
         video_cb(RDOSGFXhaveFrame, RDOSGFXwidth, RDOSGFXheight, RDOSGFXpitch);
-        RDOSGFXhaveFrame = 0;

         /* Upload audio */
         audio_batch_cb((int16_t*)audioData, samplesPerFrame);
diff --git a/libretro/libretro_gfx.cpp b/libretro/libretro_gfx.cpp
index 1baf21d9..6e52aaa7 100644
--- a/libretro/libretro_gfx.cpp
+++ b/libretro/libretro_gfx.cpp
@@ -6,7 +6,7 @@
 Bit8u RDOSGFXbuffer[1024*768*4];
 Bitu RDOSGFXwidth, RDOSGFXheight, RDOSGFXpitch;
 unsigned RDOSGFXcolorMode = RETRO_PIXEL_FORMAT_0RGB1555;
-void* RDOSGFXhaveFrame;
+Bit8u RDOSGFXhaveFrame[sizeof(RDOSGFXbuffer)];

 Bitu GFX_GetBestMode(Bitu flags)
 {
@@ -42,7 +42,7 @@ bool GFX_StartUpdate(Bit8u * & pixels,Bitu & pitch)

 void GFX_EndUpdate( const Bit16u *changedLines )
 {
-    RDOSGFXhaveFrame = RDOSGFXbuffer;
+    memcpy(RDOSGFXhaveFrame, RDOSGFXbuffer, sizeof(RDOSGFXbuffer));
 }

 // Stubs

This fixes tearing. Not sure if this can be done better.

andres-asm commented 4 years ago

Ah, that's cool! Maybe the tearing was being caused by changes to the buffer during blitting, then!

andres-asm commented 4 years ago

Yup, thanks a lot! It probably affects performance a bit, but it seems negligible.

Sooo what was happening is that, because it wasn't synced, another frame was being drawn into the same framebuffer on top of the current one.

Nice solution! Want to send a PR, or should I push it?