cginternals / globjects

C++ library strictly wrapping OpenGL objects.
https://globjects.org
MIT License

Deinitialize objects of examples in correct order #358

Closed Tobias1595 closed 6 years ago

Tobias1595 commented 6 years ago

Affects all examples except commandlineoutput and qtexample. Fixes crash on closing the application.

X-Ryl669 commented 6 years ago

Honestly, reading this hurts my eyes. We are using C++ here, and RAII. Isn't something flawed here if we have to deal with deinitialization manually?

IMHO, in trying to avoid dealing with pointers, we end up recreating "pointer"-like code (except that, instead of a `ptr = nullptr;` line, we have the painful `obj_ptr.reset(nullptr);`). Wouldn't it be a lot better with simple pointers? Or change the semantics and say the program does not own the shaders, by taking references instead (then it's clear that the referenced shader must outlive the program). Or, as another solution, use a counting_ptr<>-like class internally that starts with a unique_ptr when the count is 0 and with a plain pointer when the count is 1. In all cases, the current code is not obvious.

scheibel commented 6 years ago

Our current state requires the user of globjects to perform any memory management (create and delete) that cannot be fully hidden from the user. It is permissible to create new shaders via new Shader(GL_VERTEX_SHADER) or to use our own wrappers for C++11 smart pointers, like Shader::create(GL_VERTEX_SHADER).

So I take your comment to focus on our use of this memory management within the examples. Honestly, the code does not look intuitive (quite the opposite, as you observed), but until now I did not model an explicit dependency graph.

When you mention counting_ptr, I recall our extensive discussions around our ref_ptr.

X-Ryl669 commented 6 years ago

If globjects does not do any "public" memory management (internal is fine, I mean), then you should simplify everything by taking references as parameters. Then there is no question whatsoever, and the user is in charge of guaranteeing that the referenced objects outlive globjects' objects.

However, if the interface takes a pointer (or a pseudo-pointer like unique_ptr), then it's no longer clear who is in charge of the given object's lifetime. This confusion leads to bugs (which can be subtle, as this PR shows, because the order of destruction matters).

I'm afraid there isn't a single solution for this issue; it's more of a case-by-case matter. For example, when you set a texture's data, you give it a pointer to the picture's content, but you don't expect the object to use that pointer after the call returns. In that case, it's easy.

Yet for a program, you can decide whether or not you want to reuse your shader with another program, and that is hard without some kind of resource management (reference counting or, better, letting the user deal with it as she wills). So you should probably accept only references in that case (this will break all attach(new Shader()) code, but that is fortunate, since such code is probably wrong anyway). Accepting a plain pointer without owning it is weird, unless explicitly documented (but that implies reading the documentation for each function call, which is a pain).

Even if you want to keep the current interface, the program should use a shared_ptr internally (and accept shared_ptr in the interface), since it does not know whether the shader will be reused or not. That way, deletion order no longer matters.

In my code, I ended up creating an own_ptr that stores a bool saying whether the pointer is owned or not. It also supports being created from a reference (not owned) or from a pointer (owned), and it allows move semantics so only a single owner can exist at a time. It's lightweight, since you don't pay for a heap-stored, atomic reference counter (there is no reference counting at all). The order of destruction no longer matters.

scheibel commented 6 years ago

Thanks for your insights. After some discussion with one of my colleagues, we came to the conclusion that, although memory management and interface design are the topic here, the actual bug should not have occurred in the first place. Regardless of memory management, the deletion of one object (e.g., a Shader) should perform a clean deregistration from any referencing object (e.g., a Program). If this works, no error should be thrown. Probably the defect was not in the order of deinitialization / deletion but in the deregistration / cleanup code. We'll investigate further.