Right now, any change to a dialog (which has an FBSurface behind the scenes) requires redrawing the whole dialog. This is because it's just as fast to redraw a surface as it is to only redraw parts of it. For OpenGL, it has to do with calling glTexImage2D once on the whole surface, vs. (potentially multiple) calls to glTexSubImage2D on many parts of the surface. Direct3D has something similar.
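To make the trade-off concrete, here is a minimal OpenGL sketch (not Stella's actual backend code; the DirtyRect type, function names, and pixel format are made up for illustration):

```cpp
#include <GL/gl.h>
#include <vector>

// Assumes a texture is already bound to GL_TEXTURE_2D.
struct DirtyRect { int x, y, w, h; const unsigned char* pixels; };

// Option 1: one call re-uploads the entire surface
void uploadWholeSurface(int surfW, int surfH, const unsigned char* pixels)
{
  glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, surfW, surfH, 0,
               GL_RGBA, GL_UNSIGNED_BYTE, pixels);
}

// Option 2: potentially many calls, one per dirty region of the surface
void uploadDirtyParts(const std::vector<DirtyRect>& dirty)
{
  for(const DirtyRect& r : dirty)
    glTexSubImage2D(GL_TEXTURE_2D, 0, r.x, r.y, r.w, r.h,
                    GL_RGBA, GL_UNSIGNED_BYTE, r.pixels);
}
```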
So, if we conclude that we'll only ever draw the entire dialog rather than parts of it, then it makes no sense for each widget to track its own dirty state. If it is dirty, then by extension the whole surface is dirty, and it can just inform the dialog of that fact. And that's exactly what the current code does. Widgets set a flag in Dialog that the dialog needs to be redrawn. Turning this logic around, and having the dialog query each widget for its 'dirtiness', is slower, and in the end results in the exact same thing: the entire dialog is redrawn.
For the debugger, it happens that the entire thing is a dialog (aka DebuggerDialog), so redraws happen on the entire thing.
Long story short: if, in the end, a widget being dirty implies the dialog needs to be redrawn, then the code already works that way, and we gain absolutely nothing by changing how it works.
As for regular querying, this happens in DialogContainer::updateTime. At regular intervals, 'time' is updated in that class, which then can pass it on to its constituent dialogs (and from there to its constituent widgets). That's how we currently check whether a mouse click is still being held down. It should be possible to add code to this class to send 'time' info to dialogs/widgets (only if they register that they want it, though!). And that would involve not having to touch the rendering subsystem at all.
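A minimal sketch of that idea, assuming a hypothetical subscriber list and a Dialog::tick() method (neither exists in the current code):

```cpp
// Hypothetical extension of DialogContainer::updateTime(): besides the
// existing click-and-hold bookkeeping, forward the time to dialogs that
// registered interest, which can then tick their animated widgets.
void DialogContainer::updateTime(uInt64 time)
{
  myTime = time;  // existing behaviour (member name assumed)

  for(Dialog* d : myTimeSubscribers)  // assumed registration list
    d->tick(time);                    // assumed method; dialog forwards to widgets
}
```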
Also, whether something is dirty and whether it should animate are two different things. Even if we reworked the widgets to have their own dirty state, they'd still need to know when to animate. Hence they need time information, as mentioned above. Once they have that, it doesn't matter if the widget tells the dialog, or the dialog queries the widget; the redraw needs to happen because its time has come.
Right now, any change to a dialog (which has an FBSurface behind the scenes) requires redrawing the whole dialog. This is because it's just as fast to redraw a surface as it is to only redraw parts of it.
OK, that applies to bringing the surface to the display. But the actual drawWidget methods require CPU time too.
Right; it's a toss-up. We would need to test. On older systems, the overhead of doing multiple calls to upload parts of a surface/texture was much greater than any drawing done locally, into the buffer. Maybe that's no longer true.
But the other point still stands. Even if we change the code to work that way, it still doesn't help with animations unless the widget has access to 'time' info. And the dialog already has access to that info, so it should be able to pass it to the widget.
On older systems, the overhead of doing multiple calls to upload parts of a surface/texture was much greater than any drawing done locally, into the buffer. Maybe that's no longer true.
We could draw the widget individually and then still update the whole surface.
But the other point still stands. Even if we change the code to work that way, it still doesn't help with animations unless the widget has access to 'time' info. And the dialog already has access to that info, so it should be able to pass it to the widget.
The widget would have access to time info, because it would be polled at regular intervals to check whether it is dirty. So it can tick its own internal timer.
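For illustration, such a widget could look roughly like this (a purely hypothetical class; the 60 Hz polling rate is the one mentioned in the issue):

```cpp
// A widget that animates a blinking cursor purely from being polled:
// each isDirty() call advances an internal counter.
class BlinkingCursorWidget
{
  public:
    bool isDirty()                // assumed to be polled at ~60 Hz by the dialog
    {
      if(++myTick >= 30)          // toggle roughly twice per second
      {
        myTick = 0;
        myCursorVisible = !myCursorVisible;
        myDirty = true;           // cursor state changed, so a redraw is needed
      }
      return myDirty;
    }

    void clearDirty() { myDirty = false; }  // called after the dialog redraws it

  private:
    unsigned int myTick{0};
    bool myCursorVisible{true};
    bool myDirty{false};
};
```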
OK, you've convinced me. But I think if you want to attempt this, please do it in a branch first. It has the potential to break things, and will require a lot of testing. Or maybe I'm just overestimating how complicated it will be again :smiling_imp:
This is definitely worth a branch.
This might also allow mouse overs.
Works quite well already.
Known bugs:

- RomInfoWidget drawn at left in LauncherDialog if first selected game has no screenshot
- DebuggerDialog if other dialog is in front
- TimeMachineDialog not displayed on focus change
- (DrawWidget, PopupWidget, ContextMenu, ...)
- PopupWidget after losing focus
- ContextMenu redraws with every mouse move
- ContextMenu sometimes shades its dialog
- Dialog: the screen is not updated until the next event
- RomListWidget fully redraws when focus is gained (cannot be avoided)
- RomListWidget problematic
- RomListWidget
- StringListWidget
- PopupWidget (ContextMenu) with Escape requires a mouse click on dialog for events/redraws to happen (no regression, old bug)

ToDos:
Good work so far. It's not surprising there are some issues. The old code was tuned/tested for quite some time, so we will have to do this again. And undoubtedly there will be some assumptions that are no longer valid. But it will make drawing use less CPU, so that's great.
How complicated would it be to give each dialog its own surface? Then we can save a lot of redrawing when closing a dialog or context menu.
It already has one. This is in the private instance variable list for Dialog:

```cpp
shared_ptr<FBSurface> _surface;
```
There are also still issues with double-buffering, and PopupWidget in particular. Drawing the 'menu' sometimes doesn't show the underlying dialog.

Also, switching to fullscreen sometimes has garbage outside the dialog area (probably missing a clear() somewhere). Some platforms don't zero video memory, so we have to take care of it.
I see now too. Instead of drawing all the bottom dialogs when closing the top one, rendering their surfaces should do. The current code couples both actions.
I have experimented with that now, but I did not notice the problems you described.
BTW: Do you know how I can darken a surface? The scanlines seem to do that, no? I suppose the process is reversible?
I see now too. Instead of drawing all the bottom dialogs when closing the top one, rendering their surfaces should do. The current code couples both actions.
Yes, the code couples drawing and rendering. Re-rendering is all that's needed; redrawing only needs to happen when a widget itself changes. This is a holdover from software rendering, where there were no surfaces, and a dirty widget meant a redraw was always required. There was no concept of 'render surfaces' at all.
I have experimented with that now, but I did not notice the problems you described.
Yes, some of these things only happen in Windows, some in Linux, and some different ones in Mac. That's why I mentioned that much testing will be required. And it may also explain some of the 'strangeness' of the current code; the behaviour of video drivers isn't consistent on different platforms.
BTW: Do you know how I can darken a surface? The scanlines seem to do that, no? I suppose the process is reversible?
What do you mean by darken? If it's like the scanlines, then it's just a texture with alpha set to a certain level. Also not sure what you mean by reversible. Just re-render with a different alpha level, I guess.
I will commit my changes regarding surfaces (today or tomorrow) and then we can start testing.
With "darken" I mean that dialogs in the background should become darker, and return to normal brightness when they are on top again. Not sure if I will keep it like that; current Windows only changes the color of the title bar, and blinks the title of the active dialog when you click an inactive one. But I don't like that very much; I want to know before I click wrong. :smile:
Anyway, just to be sure: For darkening, I create a new surface per dialog and render it on top of that dialog.
Anyway, just to be sure: For darkening, I create a new surface per dialog and render it on top of that dialog.
Yes, that's basically how scanlines work:
```cpp
// Generate scanline data, and a pre-defined scanline surface
constexpr uInt32 scanHeight = TIAConstants::frameBufferHeight * 2;
std::array<uInt32, scanHeight> scanData;
for(uInt32 i = 0; i < scanHeight; i += 2)
{
  scanData[i]   = 0x00000000;
  scanData[i+1] = 0xff000000;
}
mySLineSurface = myFB.allocateSurface(1, scanHeight,
    interpolationModeFromSettings(myOSystem.settings()), scanData.data());
```
I suppose if you want to get fancy, you could just create a 1x1 pixel surface filled with 0xFF, and then render it on top of the dialog with the appropriate alpha level. Note that in the above code, the scanline texture is only 1 pixel wide. It is stretched horizontally, so we save memory by not having to allocate the entire width. It is 'stretched' over the surface as it's rendered, and this is essentially free (hardware scaling).
And to be even more fancy, you only really need one such surface, shared by all dialogs; it can be a static class variable.
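A rough sketch of that approach, built on the allocateSurface() call shown above (the names setDstSize(), attributes().blending/blendalpha, and applyAttributes() are assumptions about the FBSurface interface, and ScalingInterpolation::none is a placeholder for the interpolation mode):

```cpp
// One shared 1x1 black surface, stretched over a dialog and blended with a
// given alpha to darken it.
static shared_ptr<FBSurface> ourShadeSurface;

void shadeDialog(FrameBuffer& fb, uInt32 w, uInt32 h, uInt32 alpha)
{
  if(!ourShadeSurface)
  {
    constexpr uInt32 blackPixel = 0xff000000;  // a single opaque black texel
    ourShadeSurface = fb.allocateSurface(1, 1, ScalingInterpolation::none,
                                         &blackPixel);
  }

  ourShadeSurface->setDstSize(w, h);              // stretched over the dialog (hardware scaling)
  ourShadeSurface->attributes().blending = true;  // enable alpha blending
  ourShadeSurface->attributes().blendalpha = alpha;
  ourShadeSurface->applyAttributes();
  ourShadeSurface->render();                      // darkens whatever was rendered underneath
}
```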
Thanks.
Cool idea with the single, stretched surface. That explains why the scanline surface has only one dimension.
You're not the only one that likes to optimize things :smiley:
@sa666666 I optimized so much that now, when I close a dialog (DialogContainer::removeDialog()), I don't have to draw anything again. :smile:

But now the screen is not updated until the next draw happens. And I haven't found out how to enforce an update. :unamused:
I guess the only way to do it is to have a full refresh on every state change (when a dialog is opened or closed, toggling windowed/fullscreen mode, etc). It may not be 100% efficient, but at least we will see the output :smile:

I think this is probably related to the 'garbage' I see when switching to fullscreen mode. A clear() is missing, so it's showing the old buffer contents.

I'm pretty sure that's why the old code re-rendered the entire dialog stack when a dialog is closed. Essentially, when a dialog is closed, we don't know what was obscured, so it's best to just re-render the entire thing.
FrameBuffer::update(true) is supposed to enforce a full redraw or re-render, no matter if things are dirty or not. But during refactoring, you may have changed the intent here?
Yes, that was changed intentionally. There are no pending draws; just the dialog surfaces were rendered again. Now I want to make this visible. And I tried to call myBackend->renderToScreen() directly, but this had no effect.
You still need to call FBSurfaceSDL2::render() for all surfaces. The method FBBackendSDL2::renderToScreen() just finalizes all the renders up to that point, and 'pushes' them all out to the video hardware. So if you haven't called render at some point, renderToScreen won't actually do anything, I think. The code is:
```cpp
// - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
void FBBackendSDL2::renderToScreen()
{
  ASSERT_MAIN_THREAD;

  // Show all changes made to the renderer
  SDL_RenderPresent(myRenderer);
}
```
Note the 'show all changes' comment. Without calling render on surfaces, there are no changes, so nothing happens.
Not sure if this is what the problem is. But it's a place to start.
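In other words, the ordering has to be roughly this (the surface container name is an assumption):

```cpp
// Every surface that should appear must be rendered before the present call,
// otherwise SDL_RenderPresent() has no changes to show.
for(const shared_ptr<FBSurface>& s : mySurfaceList)  // assumed container
  s->render();                    // copy the surface's texture into the renderer

myBackend->renderToScreen();      // SDL_RenderPresent(): push everything out
```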
Well, if you look at DialogContainer::removeDialog(), I am rendering all surfaces again.
Sorry, I am going only by your comments; I haven't had time to look at the code. Maybe later this evening.
On line 179 in DialogContainer::removeDialog(), the setDirty() part is commented out. I understand why this is: to eliminate spurious redraws. However, adding it back allows the dialog to 'disappear' as it's supposed to.
So there is still some interaction between drawing and rendering; they are not fully separated yet. So doing a draw and then a render is working, but a render by itself isn't. So now we just need to track down what the difference is.
Or, maybe we just have to accept that a full redraw and render is required when closing a dialog. It's not that inefficient, after all.
Yes, it seems that completely unchanged surfaces may not be rendered at all. But I found nothing in the documentation. However, there must be a way; I don't give up easily.
There are also cases where a redraw is not happening when it should, resulting in garbage data in the buffer. For example:
```cpp
void DialogContainer::draw(bool full)
{
  if(myDialogStack.empty())
    return;

  cerr << "draw " << full << " " << typeid(*this).name() << endl;

  // Make the top dialog dirty if a full redraw is requested
  if(full)
    myDialogStack.top()->setDirty();

  // Draw and render all dirty dialogs
  myDialogStack.applyAll([&](Dialog*& d) {
    if(d->needsRedraw())
      d->redraw();
  });
}
```
Here, when full is true, the semantics of the draw are supposed to be 'force a full redraw'. In that case, the draw shouldn't depend only on needsRedraw(), but also on full. So perhaps something like:
```cpp
void DialogContainer::draw(bool full)
{
  if(myDialogStack.empty())
    return;

  cerr << "draw " << full << " " << typeid(*this).name() << endl;

  // Draw and render all dirty dialogs
  myDialogStack.applyAll([&](Dialog*& d) {
    if(full)
      d->setDirty();
    if(d->needsRedraw())
      d->redraw();
  });
}
```
There are also cases where a redraw is not happening when it should
Can you give me an example?
I can trigger it pretty consistently, but don't know how to explain it. A video would be better. Perhaps I can make a video on my camera?
Also, and you're probably not seeing this fact yet; this stuff works differently depending on the backend. So what's working for you in Direct3D may not work for me in OpenGL. And I tested on Mac using Metal, and it seems different in some cases too. That's why I initially had the redraws happening with the renders all the time; the behaviour doesn't seem to be consistent on all platforms.
Yes, the various renderers and platforms need more testing.
Anyway, I tried something again, and suddenly it works!
```cpp
void DialogContainer::removeDialog()
{
  if(!myDialogStack.empty())
  {
    myDialogStack.pop();

    if(!myDialogStack.empty())
    {
      // Rerender all dialogs (TODO: top dialog is rendered twice)
      myDialogStack.applyAll([&](Dialog*& d){
        //d->setDirty();
        d->render();
      });
      myOSystem.frameBuffer().renderToScreen(); // <- this!
    }
  }
}
```
I can trigger it pretty consistently, but don't know how to explain it. A video would be better. Perhaps I can make a video on my camera?
Sure.
This happened after exiting the Game Info dialog, Alt-tabbing away from the window, then back again. It seems the launcher is drawn, then the dialog. Then the dialog again. And finally the launcher draw is called, but it isn't dirty, so nothing happens.
I've run into something like this before. Basically, there is double-buffering going on here, and if you draw an underlying menu/dialog into one buffer and then switch to another to draw the top-level dialog, the bottom-level one may not be drawn. Sometimes it gives a black screen (like here), sometimes garbage. But the reason is that you've swapped the buffer without drawing in it. Every time renderToScreen() is called, it swaps to a new buffer. And if that buffer isn't fully redrawn, then we get this. That's why I was drawing all the dialogs each time any one of them changed. I never did figure out a way around that.
Yes, the various renderers and platforms need more testing.
Anyway, I tried something again, and suddenly it works!
```cpp
void DialogContainer::removeDialog()
{
  if(!myDialogStack.empty())
  {
    myDialogStack.pop();

    if(!myDialogStack.empty())
    {
      // Rerender all dialogs (TODO: top dialog is rendered twice)
      myDialogStack.applyAll([&](Dialog*& d){
        //d->setDirty();
        d->render();
      });
      myOSystem.frameBuffer().renderToScreen(); // <- this!
    }
  }
}
```
That renderToScreen() call is happening at the end of FrameBuffer::update(). If it isn't being called there, it means the change in state in DialogContainer::removeDialog() isn't being properly communicated back to the update() call.
Yes, that's something I had seen before. But I am optimistic.
BTW: I found that OpenGL has problems with the shading surface. It goes on and off when you tab through the OptionDialog (or move the mouse). Maybe something similar to what you described.
That renderToScreen() call is happening at the end of FrameBuffer::update(). If it isn't being called there, it means the change in state in DialogContainer::removeDialog() isn't being properly communicated back to the update() call.
Not sure what you want to tell me here. What would the update() have to know?
But instead of calling renderToScreen in removeDialog, I could provide the FrameBuffer with a flag which forces renderToScreen in update. Maybe that helps with the double buffering.
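A rough sketch of that flag idea (the member and method names are illustrative, not existing Stella API):

```cpp
// DialogContainer would call setPendingRender() instead of renderToScreen(),
// and FrameBuffer::update() would do the single present at the end.
class FrameBuffer
{
  public:
    void setPendingRender() { myPendingRender = true; }

    void update(bool force = false)
    {
      // ... existing drawing/rendering of dialogs happens here ...

      if(force || myPendingRender)
      {
        myBackend->renderToScreen();  // the only place buffers get swapped
        myPendingRender = false;
      }
    }

  private:
    unique_ptr<FBBackend> myBackend;  // assumed member
    bool myPendingRender{false};
};
```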
Basically, at the end of FrameBuffer::update(), it tests whether a change has really happened, and swaps to a new buffer (that's what renderToScreen does). I guess it made sense in the original design to have only FrameBuffer::update() swapping buffers, after all drawing/rendering is done. Having it happen in other classes seems to be exposing an implementation detail of how the FrameBuffer class works (that's the reason why that functionality isn't public). Perhaps a way to tell FrameBuffer to invalidate and update is better?
Perhaps a way to tell FrameBuffer to invalidate and update is better?
Just what I said some seconds before. :smile:
Anyway, bedtime for me now. Thanks for the help.
OK, goodnight.
BTW, I think there was a FrameBuffer::invalidate() at some point. Not sure if it was removed in my rework or yours. We need something like that again, I guess.
One more thing: OpenGL seems to randomize the surfaces if you put them on screen without rendering them all again. Something to test tomorrow.
Here's a video of switching to fullscreen in emulation mode, then opening the debugger, and finally the RAM search dialog (so the cursor will tick). Ignore the small size of the UI; I've intentionally turned off HiDPI mode:
Watch it swap between emulation in background, debugger windows shaded and then not shaded. The top-most dialog is being drawn correctly; everything else isn't. I fear we're going down a rabbit hole here :smiling_imp:
Mac does something similar, except the background is often red instead of black.
Actually, I'm seeing similar behaviour in Linux, Windows and Mac. And this is all different hardware and video cards. And different drivers (Direct3D, OpenGL, Metal). So at least we are consistent 😈 But this is actually a good thing, since it indicates a problem in the code, and not something platform-specific. The latter is always so hard to track down. So perhaps when we fix the issues on one system, it will be taken care of on another too.
BTW, not to sound discouraging, but this is why I avoided the UI code after working on it for so long. I really started to get burnt out.
I think one of the main issues is that we must not call renderToScreen unless something has actually changed. It will swap buffers, and the new one won't necessarily match the old one. And we can't depend on something rendered to one of them being present in the other.
I believe the way the old code worked is that if anything changed, the entire dialog stack had to be redrawn. Now that drawing is separated from rendering, it applies for the latter only: if anything has changed by the end of FrameBuffer::update(), then the changes are flushed and the buffers swapped with renderToScreen. Conversely, if nothing has changed, then the buffer shouldn't be swapped.

This is what we're seeing here. Top-level dialogs are being drawn correctly, and underlying ones are not. That's why we're getting garbage, old contents, red/black colours, etc. When we re-render, we have to re-render everything. And we should only call renderToScreen once per framebuffer update, and only if re-rendering has occurred.
@thrust26, I know this seems slightly less efficient, but I think it's the only way it can work. At least we've separated drawing from rendering, and only do the former when necessary. But the latter has to happen on any change, and it has to happen to everything on the screen.
TL;DR: All the re-renders have to happen together, in the same update, and be pushed to the screen at the end of that update. Whether the surfaces have actually changed (by widgets drawing into them) is irrelevant. Dirty surfaces imply a re-render is required. Double-buffering requires all re-renders to be done together, even ones underneath.
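As a sketch of that rule, under assumed names (a draw() that reports whether anything changed, and a render() that re-renders the whole dialog stack):

```cpp
// Hypothetical shape of the UI path in FrameBuffer::update(): draw only the
// dirty dialogs, but if anything changed at all, re-render every dialog
// surface and present exactly once.
void FrameBuffer::update()
{
  const bool changed = myDialogContainer->draw();  // redraw dirty dialogs,
                                                   // report if anything changed
  if(changed)
  {
    myDialogContainer->render();   // re-render the *entire* dialog stack
    myBackend->renderToScreen();   // single buffer swap per update
  }
}
```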
When we re-render, we have to re-render everything
Yup, that was my point too. Fingers crossed, I will try now.
@sa666666 I have changed the rendering to what you suggested. This now looks fine with all 4 renderers I have. I hope this works for you too.
I'm in a lab now, and will be for much of today. So I'm not going to have a lot of time to test this until this evening (10 hours from now).
At first glance, testing on Mac, all the re-render issues are gone. 👍 So that's great. I still need to test this in Linux, but I suspect it will be the same.
Now some bugs:

- … clear(). In fact, it might be better to just do a clear() whenever a sequence of re-renders happens, to be sure the buffer is clean.
- A crash in FrameBuffer::setUIPalette():

```cpp
if(&myOSystem.eventHandler())
  update(true);
```

What is the purpose of this code? Perhaps I can rework it so it doesn't crash, if I know what the intent is.
Currently the whole UI is redrawn when a widget changes (becomes "dirty"). It might(!) be more efficient if only the "dirty" widgets are redrawn.
This would also allow the (active) widgets to animate themselves (e.g. an animated cursor, scrolling labels when there is not enough space, or some other fancy stuff), since they can be queried at regular 60 Hz intervals then.
The code changes required should be quite small, but the devil is in the detail.
- Add an isDirty method to each GuiObject
- Change setDirty to update an internal state instead of informing the dialog
- The dialog checks isDirty for all its widgets (see the sketch below)
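A minimal sketch of that interface (the member names, the widget container, and the Dialog helper below are assumptions; the real classes carry much more):

```cpp
// Each GuiObject tracks its own dirty state instead of flagging the dialog.
class GuiObject
{
  public:
    virtual ~GuiObject() = default;

    void setDirty()      { myDirty = true;  }  // only updates internal state
    bool isDirty() const { return myDirty;  }
    void clearDirty()    { myDirty = false; }

  private:
    bool myDirty{false};
};

// The dialog then polls its widgets and redraws only the dirty ones
// ('myWidgets' and 'drawWidget()' are stand-ins for the real members).
void Dialog::drawDirtyWidgets()
{
  for(Widget* w : myWidgets)
    if(w->isDirty())
    {
      w->drawWidget(false);  // redraw just this widget into the surface
      w->clearDirty();
    }
}
```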