ocornut / imgui

Dear ImGui: Bloat-free Graphical User interface for C++ with minimal dependencies
MIT License

WGPU backend: where are device required limits specified? #7367

Open ezorzin opened 8 months ago

ezorzin commented 8 months ago

Version/Branch of Dear ImGui:

Version 1.90.4, master

Back-ends:

imgui_impl_wgpu.cpp + imgui_impl_glfw.cpp

Compiler, OS:

macOS 11.7.10, VSCode + Clang

Full config/build information:

No response

Details:

Hi,

I am experimenting with IMGUI + WEBGPU (C++ bindings) on a Mac. I am learning WEBGPU from https://eliemichel.github.io/LearnWebGPU/basic-3d-rendering/some-interaction/simple-gui.html. The examples provided there are a little outdated with respect to IMGUI, but with minor modifications I have been able to make them work.

Everything is fine: I am able to create a small "Hello World" IMGUI panel on top of my graphics shader. But I have one problem: when specifying the required limits while requesting the WEBGPU device, I don't know what resources IMGUI needs.

In my code, when doing:

device_descriptor.requiredFeaturesCount = 0;
device_descriptor.requiredLimits        = &required_limits;
device_descriptor.defaultQueue.label    = "The default queue";
device  = adapter.requestDevice (device_descriptor);

I have to comment out the line device_descriptor.requiredLimits = &required_limits; to suppress the limits request (hence using whatever resources are available on the graphics adapter) in order to make it work. Otherwise my code crashes with a segmentation fault.
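For reference, here is a sketch (not the exact code from this report) of how the required limits are typically set up with the webgpu.hpp wrapper used in the LearnWebGPU tutorial. If the request cannot be satisfied, requestDevice() fails, and using the resulting null device later is one common way to end up with a segmentation fault:

// Sketch only, assuming the webgpu.hpp wrapper from the LearnWebGPU tutorial;
// names like adapter, required_limits and device_descriptor match the snippet
// above, but the exact code in the report may differ.
wgpu::SupportedLimits supported_limits;
adapter.getLimits(&supported_limits);                  // what the physical adapter offers

wgpu::RequiredLimits required_limits = wgpu::Default;  // start from "no specific requirement"
required_limits.limits.maxVertexAttributes = 3;        // "max" limits must not exceed the adapter's values
// "min*" alignment limits are better when smaller; copying the adapter's values is the safe choice.
required_limits.limits.minUniformBufferOffsetAlignment = supported_limits.limits.minUniformBufferOffsetAlignment;
required_limits.limits.minStorageBufferOffsetAlignment = supported_limits.limits.minStorageBufferOffsetAlignment;

device_descriptor.requiredLimits = &required_limits;
device = adapter.requestDevice(device_descriptor);
// Check that the returned device is valid before using it; an unsatisfiable
// request yields no device, and using it anyway typically crashes.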

I am speaking of device limits like maxVertexAttributes, maxVertexBuffers, maxBufferSize, etc.

My question is: regarding ALL those parameters, how do I know what to request for my device from the point of view of what IMGUI needs?

At the moment, by trial and error, I have figured out a minimal set of values which makes my code work, but I am not sure whether this minimal set is enough for anything I might do with IMGUI: for all my other data structures I can estimate the required resources exactly.

Could you please clarify the list and values of all these device requirements?
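For illustration only (this is not an official list published by Dear ImGui and may change between versions): reading imgui_impl_wgpu.cpp suggests the render pipeline uses one interleaved vertex buffer with three attributes (ImDrawVert: position, UV, packed color), two bind groups (group 0: uniform buffer + sampler, group 1: texture view), and one uniform buffer, sampler and sampled texture per shader stage. A hedged sketch of the corresponding limits, with the size-related limits left application-dependent:

// Illustrative values only, derived from reading imgui_impl_wgpu.cpp at the time
// of writing; they are not a guaranteed minimum and may change between versions.
required_limits.limits.maxVertexBuffers                 = 1;                   // one interleaved vertex buffer
required_limits.limits.maxVertexAttributes              = 3;                   // pos, uv, col (ImDrawVert)
required_limits.limits.maxVertexBufferArrayStride       = sizeof(ImDrawVert);  // 20 bytes with the default layout
required_limits.limits.maxBindGroups                    = 2;                   // group 0: uniforms + sampler, group 1: texture
required_limits.limits.maxSampledTexturesPerShaderStage = 1;                   // font atlas (plus any textures you bind yourself)
required_limits.limits.maxSamplersPerShaderStage        = 1;
required_limits.limits.maxUniformBuffersPerShaderStage  = 1;                   // projection matrix (+ gamma)
// maxBufferSize, maxUniformBufferBindingSize, maxTextureDimension2D, etc. depend on
// how much UI you draw and how large your font atlas is; there is no fixed answer.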

Thanks

Screenshots/Video:

No response

Minimal, Complete and Verifiable Example code:

// Here's some code anyone can copy and paste to reproduce your issue
ImGui::Begin("Example Bug");
MoreCodeToExplainMyIssue();
ImGui::End();

ezorzin commented 8 months ago

P.S: please ignore the Minimal Example, I just forgot to delete the default code before submitting this issue :)

GamingMinds-DanielC commented 8 months ago

Vertices with a position, one set of texture coordinates and a single color are enough for ImGui. A single bound texture is also all that is needed. The library is hardware-agnostic; in a custom backend you could even implement a fixed-function pipeline if you wanted to.

How big the buffers need to be depends entirely on what you want to do; more complex UIs generally need more vertices, up to a certain point (invisible items get clipped out). On the other hand, there is no upper bound: e.g. with draw lists and custom draw commands you can exceed any limit. There is also no required minimum texture size; a very small texture is enough if you don't need many fonts or extensive glyph ranges, and if you go overboard on that end you could make it so that even 16k textures won't be enough.

Short answer: no hard limits, everything depends on how you use it.
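Concretely, the vertex layout described above corresponds to ImDrawVert in imgui.h. A matching WebGPU vertex layout, as a sketch using the C webgpu.h types and similar in spirit to what the stock imgui_impl_wgpu backend sets up, assuming the default ImDrawVert layout:

// ImDrawVert is declared in imgui.h (default layout, 20 bytes):
//   struct ImDrawVert { ImVec2 pos; ImVec2 uv; ImU32 col; };
// Sketch of a matching vertex layout; assumes imgui.h, webgpu.h and <cstddef> are included.
WGPUVertexAttribute attributes[3] = {};
attributes[0].format = WGPUVertexFormat_Float32x2; attributes[0].offset = offsetof(ImDrawVert, pos); attributes[0].shaderLocation = 0;
attributes[1].format = WGPUVertexFormat_Float32x2; attributes[1].offset = offsetof(ImDrawVert, uv);  attributes[1].shaderLocation = 1;
attributes[2].format = WGPUVertexFormat_Unorm8x4;  attributes[2].offset = offsetof(ImDrawVert, col); attributes[2].shaderLocation = 2;

WGPUVertexBufferLayout vertex_layout = {};
vertex_layout.arrayStride    = sizeof(ImDrawVert);
vertex_layout.stepMode       = WGPUVertexStepMode_Vertex;
vertex_layout.attributeCount = 3;
vertex_layout.attributes     = attributes;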

ezorzin commented 8 months ago

So, how can I avoid memory leaks? How can I estimate what I can draw with IMGUI before actually drawing it and potentially crashing the program?

GamingMinds-DanielC commented 8 months ago

It is not a memory leak if you allocate a buffer that is bigger than it needs to be. A memory leak is when you, for example, grow a buffer by allocating a bigger one and replacing the old one while forgetting to release it. You can avoid leaks by using managed containers and by being careful when you need to do manual memory management. But this is not the right place to learn memory management; this issue tracker is intended for ImGui-specific issues.

As I said earlier, it is entirely up to your application how big the buffers need to be. If you want to estimate in advance, test with big buffers and keep track of their utilization, then reduce to what you need plus some safety margin. Or you don't do that at all and instead implement buffers that grow as needed in your custom backend. Take a look at existing backends for reference.
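A minimal sketch of the grow-as-needed idea for the vertex buffer (a hypothetical helper using the C webgpu.h API; the stock imgui_impl_wgpu backend does essentially this with its own bookkeeping, and the index buffer is handled the same way with ImDrawIdx):

// Hypothetical helper, not part of Dear ImGui: (re)create the vertex buffer only
// when the current frame needs more room than the buffer currently has.
// Assumes imgui.h and webgpu.h are included and `device` is a valid WGPUDevice.
void EnsureVertexBufferCapacity(WGPUDevice device, WGPUBuffer& buffer, int& capacity_vtx, const ImDrawData* draw_data)
{
    if (buffer != nullptr && capacity_vtx >= draw_data->TotalVtxCount)
        return;                                          // current buffer is large enough

    if (buffer != nullptr)
        wgpuBufferRelease(buffer);                       // drop the old, too-small buffer

    capacity_vtx = draw_data->TotalVtxCount + 5000;      // grow with some headroom

    WGPUBufferDescriptor desc = {};
    desc.usage            = WGPUBufferUsage_CopyDst | WGPUBufferUsage_Vertex;
    desc.size             = (uint64_t)capacity_vtx * sizeof(ImDrawVert);
    desc.mappedAtCreation = false;
    buffer = wgpuDeviceCreateBuffer(device, &desc);
}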

ezorzin commented 8 months ago

Quoting you: "It is not a memory leak if you allocate a buffer that is bigger than it needs to be." Maybe "memory leak" is not the appropriate term; anyway, my problem is exactly this: how can I know how big the buffers for IMGUI need to be?

I know my buffers, and in general all other resources I use. What I don't know is what IMGUI takes.

For instance here https://eliemichel.github.io/LearnWebGPU/basic-3d-rendering/some-interaction/simple-gui.html they say:

requiredLimits.limits.maxBindGroups = 2;
//                                    ^ This was a 1

Just as an example: I tried it and discovered that this resource requirement is necessary but not sufficient. Are there other things like that which one should know in advance? I understand one can always start from the top and work down, as you suggest, which is basically what I am doing now.

But before continuing with this approach I just wanted to be sure whether there is some other, more systematic way to do it.

GamingMinds-DanielC commented 8 months ago

You can use ImGui::ShowMetricsWindow() to see exactly how much geometry gets generated for your UI. But there is no guarantee that these values stay roughly the same from version to version: if some drawing code gets updated with a better version, for example, the amount of geometry generated might change. It also very much depends on your settings and your use of the library, so measuring it yourself is the only thing you can do.

To avoid exceeding the buffers, you then need to safeguard your custom backend against cases where the amount of geometry generated is too much for your buffers to hold. But the trouble really isn't worth it: if you just implement growable buffers and grow them as needed, you won't run into problems. The amount of geometry generated is low enough that even with high utilization it won't be a bottleneck, except maybe in scenarios that are specifically constructed to generate huge amounts of geometry.
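For the measuring part, the counters ImGui already maintains are enough; for example (a sketch, assuming <cstdio> is included for printf):

// After ImGui::Render(), the draw data reports exactly how much geometry this
// frame produced; io.MetricsRenderVertices / MetricsRenderIndices expose the same numbers.
ImGui::Render();
ImDrawData* draw_data = ImGui::GetDrawData();
printf("vertices: %d (%d bytes), indices: %d (%d bytes)\n",
       draw_data->TotalVtxCount, (int)(draw_data->TotalVtxCount * sizeof(ImDrawVert)),
       draw_data->TotalIdxCount, (int)(draw_data->TotalIdxCount * sizeof(ImDrawIdx)));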

ezorzin commented 8 months ago

Ok, thanks!

ocornut commented 8 months ago

Could you please clarify the list and values of all these device requirements?

The code in imgui_impl_wgpu.cpp is fairly straightforward; you can read it and understand what it needs.

But I have one problem: when specifying the required limits while requesting the WEBGPU device, I don't know what resources IMGUI needs.

I'm not sure you are trying to solve a real problem here; it seems more like a theoretical one. Why request a limit? What are you going to do if the limit isn't supported by the device, versus what is going to happen if the limit is not checked but exceeded? My assumption is that you shouldn't bother with requesting that minimum at this point in your development.

ocornut commented 8 months ago

If it turns out that those requested limits are really meaningful in the WGPU ecosystem, I assume we could add a function exposing the minimum limits desired by the backend, and you can start constructing it by looking at the backend code.

ezorzin commented 8 months ago

I am no expert on WEBGPU, but as far as I understand from the links I already cited, the concepts of "device" and "adapter" are there exactly to mitigate the problem of "it worked on my PC (and maybe not on yours)". It is therefore suggested to estimate the resources your code needs and create a device requesting those resources, instead of using everything available on the physical adapter.

ezorzin commented 8 months ago

P.S. This is where I read about the idea of "device" and "limits" vs. "adapter": https://eliemichel.github.io/LearnWebGPU/getting-started/the-device.html. Following that link, they develop WGPU code step by step, making it more and more elaborate as the explanation proceeds.