ppy / osu-framework

A game framework written with osu! in mind.
MIT License

Rendering extensibility limitations #5343

Open Flutterish opened 2 years ago

Flutterish commented 2 years ago

This issue documents the problems I encountered while making the o!f-xr (/overhaul) extension, and what could be done to make o!f more friendly towards outside extensions. I will order them by "severity/difficulty" from least to most. I do not consider "merging" the extensions' code into the main framework a valid solution.

frenzibyte commented 2 years ago
> With the introduction of IRenderer, frame buffers only support a 16-bit depth attachment. (I was previously using 32-bit depth.) It would be appreciated if the attachments matched, or at least approximated, what was previously possible. I am willing to write the code to support this myself, if you like.

Feel free to PR any attachment type you require; it should be a simple D32 addition to the existing RenderBufferFormat enum.
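For illustration, such an addition might look like this (a sketch only — members other than D16 are assumptions about the enum's actual contents):

```csharp
// Sketch of the suggested addition. D16 is the existing 16-bit depth format;
// D32 is the proposed 32-bit one. Any other members the real enum has are omitted.
public enum RenderBufferFormat
{
    D16, // existing 16-bit depth attachment
    D32, // proposed 32-bit depth attachment
}
```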

> o!f-xr uses panels, a 2D context nested on top of a 3D mesh. It currently requires me to use reflection to get the 2D draw node hierarchy for a given subtree index. GenerateDrawNodeSubtree should be public, or at the very least protected.

Can you elaborate further on how you find GenerateDrawNodeSubtree to be helpful rather than using CreateDrawNode? Unless you're referring to the instantiated DrawNodes array. I'm not 100% sure about exposing them without knowing your specific use-case.

> Some programs might depend on specific backends for various reasons (optimisations, shaders, integrations with other software, etc.). I think it should be possible to force the framework to use a specific type of renderer, although I believe this is planned, at the very least at the user-setting level.

We plan to add support for other backends and use the most suitable one based on the user's environment, but I wouldn't mind moving the CreateRenderer method to Game in order to allow specific games to use specific/custom renderers (@smoogipoo thoughts?)

> o!f-xr integrates with OpenXR, which requires both knowing which renderer is used and the ability to get the native texture pointer. I think IRenderer could support some kind of enum (indicating which backend it uses, like GL) and a method to retrieve renderer-specific data. I would suggest: ...
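One possible shape for such an API (purely hypothetical — every name below is invented for illustration and is not osu-framework API):

```csharp
// Purely hypothetical sketch of backend identification on the renderer.
// RendererBackend, Backend and TryGetNativeTexture are all invented names.
public enum RendererBackend
{
    OpenGL,
    Vulkan,
    Direct3D,
}

public interface IBackendInfo
{
    // Which graphics backend this renderer runs on.
    RendererBackend Backend { get; }

    // Attempts to retrieve the native handle (e.g. a GL texture id) for a
    // texture, returning false if the backend cannot expose one.
    bool TryGetNativeTexture(object texture, out nint nativeHandle);
}
```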

I don't really know about these remaining points. If you wish to use specific OpenGL renderer classes and optimise around them, then you're better off rolling your own OpenGL/OpenXR renderer implementation that has its classes exposed and optimised to your liking. The existing classes shouldn't be tinkered with externally, and even small changes may very well break your entire rendering logic.

Flutterish commented 2 years ago

> Can you elaborate further on how you find GenerateDrawNodeSubtree to be helpful rather than using CreateDrawNode? Unless you're referring to the instantiated DrawNodes array. I'm not 100% sure about exposing them without knowing your specific use-case.

The 3D Panel has its own triple buffer of 2D draw nodes which it renders onto itself (quite similar to ProxyDrawable?). It can't use the existing one because the 2D and 3D hierarchies are separate: they have fundamentally different draw logic. More specifically, the 3D draw node hierarchy is flattened at the Scene.RenderPipeline level using an optimised HashSet/List hybrid. There is also the consideration that the extra data 2D draw nodes carry isn't needed.

> I don't really know about these remaining points. If you wish to use specific OpenGL renderer classes and optimise around them, then you're better off rolling your own OpenGL/OpenXR renderer implementation that has its classes exposed and optimised to your liking. The existing classes shouldn't be tinkered with externally, and even small changes may very well break your entire rendering logic.

I don't specifically care about OpenGL; it's more that the OpenXR integration requires me to pass textures to render to the VR headset, for which it needs the renderer type (GL, Vulkan, Direct3D, etc.) and a texture id.

Flutterish commented 2 years ago

Oh, to elaborate on the first point: GenerateDrawNodeSubtree has its own logic, like updating the draw nodes. This works with internal invalidation ids which I would rather not touch, because that's even more reflection.

Flutterish commented 2 years ago

Ah, sorry, I think the explanation was quite hectic. Let me try again: the Panel has a source Drawable. This drawable has its subtree of 2D draw nodes, which I retrieve with GenerateDrawNodeSubtree because it has internal logic specific to that drawable. The subtree of that Drawable is not attached to any parent, as I can't mix the 2D and 3D draw node hierarchies, as explained before. This means it also has its own triple buffer, which gives it subtree ids to use, and that GenerateDrawNodeSubtree is required to update the state of the 2D draw nodes. You can think of this as a nested GameHost.UpdateFrame/DrawFrame procedure.

Flutterish commented 2 years ago

> We plan to add support for other backends and use the most suitable one based on the user's environment, but I wouldn't mind moving the CreateRenderer method to Game, in order to allow specific games to use specific/custom renderers (@smoogipoo thoughts?)

This begs one question, though: what about nested games (say, visual tests)? How would we render a nested game if it uses a different renderer? Would that even be allowed/considered?

frenzibyte commented 2 years ago

For simplicity, only the game instance that the host runs on will have its renderer applied to all nested games.

peppy commented 2 years ago

> Some programs might depend on specific backends for various reasons (optimizations, shaders, integrations with other software etc.). I think it should be possible to force the framework to use a specific type of renderer, although I think this is planned, at the very least at user-setting level.

At most that would be a preference. There is zero guarantee and zero intention to provide a guarantee of a specific renderer, but you could easily check the renderer on startup and show an error if it's not the one you want to work with.
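A minimal sketch of that startup check, assuming the renderer is reachable from the game via its host (`Host.Renderer` and the accessibility of `OpenGLRenderer` are assumptions here, not guaranteed API):

```csharp
using System;

// Sketch only: fail fast if the active renderer isn't the required backend.
// Property names and type accessibility are assumptions about the framework.
public class MyXrGame : osu.Framework.Game
{
    protected override void LoadComplete()
    {
        base.LoadComplete();

        // Show an error (here, throw) if the renderer isn't the one we need.
        if (Host.Renderer is not osu.Framework.Graphics.OpenGL.OpenGLRenderer)
            throw new NotSupportedException("This game requires the OpenGL renderer for its OpenXR integration.");
    }
}
```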

Also, please note that there's an edit button on GitHub – we generally prefer posting once rather than four times in a row.

smoogipoo commented 2 years ago

I don't have a complete view of how your project works, but going based on the things written in this thread, it sounds like a very custom implementation of everything. I'm not ready to add hacks for custom implementations (such as exposing GenerateDrawNodeSubtree()) and suggest that it's better for you to fork o!f as a whole instead.

As for individual points...

> frame buffers only support a 16-bit depth attachment. (I was previously using 32-bit depth)

As above, PR or request it in an issue thread.

> GenerateDrawNodeSubtree should be public, or at the very least protected

Not happening for now, as above.

> I think it should be possible to force the framework to use a specific type of renderer

Could be done, but I also envision a world in which we have fallback renderers: try D3D12, then D3D11, then Vulkan, then OpenGL. Or something like that. Maybe even D3D9 if there are any such users.
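The fallback idea could be sketched roughly like so (hypothetical — `CreateBestRenderer` and the factory list are invented for illustration; only `IRenderer` is a framework name):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch of a renderer fallback chain: try each backend factory
// in preference order and use the first one that initialises successfully.
public static class RendererFallback
{
    public static IRenderer CreateBestRenderer(IEnumerable<Func<IRenderer>> factoriesInPreferenceOrder)
    {
        foreach (var factory in factoriesInPreferenceOrder)
        {
            try
            {
                return factory();
            }
            catch
            {
                // This backend isn't available on the current system; try the next one.
            }
        }

        throw new NotSupportedException("No supported renderer could be initialised.");
    }
}
```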

I do think a lot of the issues you're having could be resolved via a custom renderer, such as the "reset of the renderer stack".

> and an ability to get the native texture pointer

Not happening. I suggest a custom renderer.

> which requires me to reset and restore the state of the backend

Need more information on what you're requesting here. Is it resetting the entirety of the renderer, a la calling OpenGLRenderer.BeginFrame() (https://github.com/ppy/osu-framework/blob/8943d4fe5223489f3d6cbf0bb6661256cd9b6178/osu.Framework/Graphics/OpenGL/OpenGLRenderer.cs#L178-L211)? Or something more specific, as you have there: resetting only bound shaders and the VBO/VAO?

Generally I'm also against this, and recommend exposing ways to do what you're going for - such as being able to bind a null buffer, or being able to bind VAOs.

> it should be possible to construct the frame buffer through the renderer (cast to OpenGLRenderer) with OpenGL buffer formats as arguments

Also not happening. The point of this entire change was to reduce exposure of platform-specific types. I'm not about to add them back.

> OpenXR integration requires me to pass textures to render to the VR headset, for which it needs the renderer type (GL, Vulkan, Direct, etc) and a texture id

As above. When we move to Veldrid, we won't be able to give you a texture ID because Veldrid also doesn't expose that. Your best bet is a custom renderer.

If you want, you can use reflection as you're already doing to hack through undocumented and platform-specific implementations.

Flutterish commented 2 years ago

> I don't have a complete view of how your project works, but going based on the things written in this thread, it sounds like a very custom implementation of everything.

The reason for this is that o!f simply does not provide the tools I need. Shaders are created with autogenerated wrappers that mess with depth, I can't create VAOs and other buffers, and if I wanted to create materials they would need access to shader uniforms, which are internal, etc.

> I'm not ready to add hacks for custom implementations (such as exposing GenerateDrawNodeSubtree())

Would it perhaps be possible to create a "GameHost-like" drawable which would contain its own triple buffer and an isolated UpdateFrame/DrawFrame procedure? I think that would solve the problem, more or less. I haven't checked, but I'm pretty sure ProxyDrawable does this already.

> Need more information on what you're requesting here. Is it resetting the entirety of the renderer a-la calling OpenGLRenderer.BeginFrame():

Something similar, but with the ability to restore that state later. I envision something like:

```csharp
renderer.BeginContext();  // flushes the current batch, resets the stacks and state to the initial state
DrawStuff();
renderer.FinishContext(); // restores the stacks and state from before BeginContext
```

This would guarantee a clean state when the drawing does something very custom, like rendering a whole scene to a frame buffer, which is essentially like a whole separate window.
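Under the hood, such a pair could be little more than a save/restore stack over whatever state snapshot the renderer tracks (again hypothetical — none of these names exist in the framework):

```csharp
using System;
using System.Collections.Generic;

// Hypothetical sketch: BeginContext/FinishContext as a save/restore stack over
// a renderer state snapshot. TState stands in for whatever the renderer tracks
// (bound shaders, VBO/VAO, blend state, the various stacks, ...).
public sealed class DrawContextStack<TState>
{
    private readonly Stack<TState> saved = new Stack<TState>();
    private readonly Func<TState> capture;   // snapshots the current renderer state
    private readonly Action<TState> restore; // applies a previously captured state

    public DrawContextStack(Func<TState> capture, Action<TState> restore)
    {
        this.capture = capture;
        this.restore = restore;
    }

    // Saves the current state, then resets to the given clean baseline.
    public void BeginContext(TState cleanState)
    {
        saved.Push(capture());
        restore(cleanState);
    }

    // Restores the state from before the matching BeginContext call.
    public void FinishContext() => restore(saved.Pop());
}
```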

> Or more specific as you have there - resetting only bound shaders and vbo/vao.

This is only the 3D-to-2D context switch. The 2D-to-3D switch pretty much sets the whole GL state to something custom (although without any hacks this time), and similarly Panels (2D embedded on the surface of a 3D mesh) reset most of the state.

> [...] and suggest that it's better for you to fork o!f as a whole instead.

Unfortunately, that's not an option. I require o!f-xr to be able to run both inside and around an existing o!f application (rulesets, and osu!xr). As I said before, I don't consider having access to your internals a valid solution, and it certainly won't help with similar problems in the future if someone else attempts this.

> As above. When we move to Veldrid, we won't be able to give you a texture ID because Veldrid also doesn't expose that. Your best bet is a custom renderer.

This will be quite problematic then, but I suppose that's my problem. I guess I will have to hack through Veldrid, haha.

frenzibyte commented 2 years ago

@smoogipoo Regarding custom renderers, consumers currently cannot run their games on a custom IRenderer implementation without replacing Host.GetSuitableHost with custom GameHost implementations for each platform.

It's somewhat low priority, but I would like to know your opinion on moving CreateRenderer from GameHost to Game to make that easier for consumers, API-wise.

smoogipoo commented 2 years ago

As I said in my post above, it could be made overrideable but I also foresee us having a renderer preference list. It would likely be in the HostOptions parameter, though.

It does not need to be addressed immediately; priority should be given to implementing Veldrid, to understand whether such a preference list is needed.