baku89 / ISF4AE

An After Effects plug-in that lets you use GLSL written in ISF as an effect
MIT License

Crash when MFR is enabled, and glBindBuffer() issues in the Windows version #26

Open timurco opened 11 months ago

timurco commented 11 months ago

Hello everyone! I decided to create a separate issue for a problem that was previously mentioned in the enhancement issue regarding Windows. There is now a Windows version, but it seems to work with certain limitations, apparently related to multithreading issues in OpenGL, as mentioned here.

When we simply move the sliders or the playhead in the composition, everything is fine. Likewise, if we turn off MFR mode in the settings, the plugin's output renders perfectly.
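
As an aside, MFR is opt-in per effect in the AE SDK, so until the GL threading is sorted out, the plugin could simply not declare support, and AE would render it on a single thread regardless of the user's setting. A minimal sketch, assuming the standard SDK GlobalSetup signature (this is not the plugin's actual GlobalSetup):

#include "AE_Effect.h"

// Sketch only: an effect opts in to Multi-Frame Rendering by setting
// PF_OutFlag2_SUPPORTS_THREADED_RENDERING here (and mirroring it in the
// PiPL's OutFlags2). Leaving it unset keeps the effect single-threaded.
static PF_Err GlobalSetup(PF_InData* in_data, PF_OutData* out_data, PF_ParamDef* params[], PF_LayerDef* output) {
  out_data->out_flags2 |= PF_OutFlag2_SUPPORTS_SMART_RENDER;
  // out_data->out_flags2 |= PF_OutFlag2_SUPPORTS_THREADED_RENDERING;  // MFR opt-in, left disabled for now
  return PF_Err_NONE;
}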


However, with MFR mode enabled, a crash occurs because cpuBackingPtr turns out to be NULL. I've made this more explicit with an assert inside SmartRender at this line:

if (!outputImageCPU->cpuBackingPtr) {
  err = PF_Err_OUT_OF_MEMORY;
  // A more explicit assert for the failure that appears with the newer
  // versions of the Nvidia Studio driver described in this post:
  // https://github.com/baku89/ISF4AE/issues/19#issuecomment-1724631129
  assert("FATAL! CPU backing pointer is NULL! Cannot copy CPU buffer to EffectWorld!" && false);
}

There are some peculiarities, though, many of which I mentioned in the comments to that issue. I've since found that the driver version doesn't actually affect the issue: during rendering or background frame calculation, cpuBackingPtr is always NULL.
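
Since the NULL pointer only shows up once multiple MFR threads render at the same time, one experiment worth documenting (a sketch, not a claimed fix) is serializing every GL round-trip behind a single process-wide mutex; if the crash disappears, concurrent access to the shared context is the culprit:

#include <mutex>

// Hypothetical guard, not currently in the plugin: with MFR enabled,
// SmartRender can run on several threads at once, but they all share
// one GL context tree created in GlobalSetup.
static std::mutex gGLMutex;

// ... inside SmartRender, wrapped around the whole GL round-trip:
{
  std::lock_guard<std::mutex> lock(gGLMutex);
  // every uploader/downloader transfer and scene render goes in here,
  // so only one thread at a time can touch the shared context
}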

Also, the scene behaves very strangely if, for some reason, you don't execute this line before compiling it:

// Without this USELESS variable I get a glitch where the scene
// stops working without raising any errors
// FIXME: Dig into this to figure out the reason
ISF4AESceneRef useless = VVISF::CreateISF4AESceneRefUsing(globalData->context->newContextSharingMe());

Notably, you actually need to assign the result to a variable; otherwise it doesn't fix the problem. I added these useless variables in a separate commit.

Without it, the screen is sometimes black, and the gl2aeScene doesn't work either. There's still more to figure out here, and it may not be that critical if it isn't related to the main problem causing the crash.
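
One possible explanation for why the assignment matters (pure speculation on my part): the ...Ref types in VVISF/VVGL are std::shared_ptr aliases, so a discarded temporary is destroyed at the end of the statement, while a named variable keeps the scene, and the context that was shared to create it, alive until the end of the scope:

// Speculative illustration -- the two forms below are not equivalent,
// assuming ISF4AESceneRef is a shared_ptr alias.

// As a temporary: the scene and its freshly shared GL context are torn
// down as soon as the statement ends, so any side effect on GL state
// (context creation, making it current, etc.) is immediately undone.
VVISF::CreateISF4AESceneRefUsing(globalData->context->newContextSharingMe());

// As a named variable: the scene lives to the end of the enclosing scope,
// so whatever state its construction set up survives while the real
// scene compiles.
ISF4AESceneRef useless = VVISF::CreateISF4AESceneRefUsing(globalData->context->newContextSharingMe());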


Possible Cause

First, for what it's worth, the context on Windows is initialized slightly differently. I did this based on intuition and on examples, so perhaps I got something wrong in GlobalSetup:

#ifdef _WIN32
  GLContext::bootstrapGLEnvironmentIfNecessary();
  // VVGL on Windows doesn't have pixel format functions
  globalData->context = VVGL::CreateNewGLContextRef(nullptr, nullptr);
#else
  globalData->context = VVGL::CreateNewGLContextRef(NULL, CreateCompatibilityGLPixelFormat());
#endif
  VVGL::CreateGlobalBufferPool(globalData->context->newContextSharingMe());

  globalData->downloader = VVGL::CreateGLTexToCPUCopierRefUsing(globalData->context->newContextSharingMe());
  globalData->uploader = VVGL::CreateGLCPUToTexCopierRefUsing(globalData->context->newContextSharingMe());
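
By the way, between creating the context and creating the global pool, it might be worth verifying the context actually works. None of this is in the plugin yet; it's just a defensive sketch (makeCurrent() is VVGL's method, the early-return value is my choice):

// Hypothetical sanity check after CreateNewGLContextRef: if no GL context
// can be made current here, the buffer pool, downloader, and uploader
// below would all be handed a dead context.
globalData->context->makeCurrent();
const GLubyte* glVersion = glGetString(GL_VERSION);
if (glVersion == nullptr) {
  // glGetString returns NULL when no context is current on this thread
  return PF_Err_INTERNAL_STRUCT_DAMAGED;
}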

In the GlobalSetup code above, the global buffer pool is initialized and is subsequently used by the downloader and the uploader, and I think the problem lies there: later, when we copy data from the CPU to the GPU through the uploader in the SmartRender function, here:

ERR(uploadCPUBufferInSmartRender(globalData, in_data->effect_ref, extra, checkoutIndex, layerSize, image));

Further inside this function, we reach this call:

auto imageAE = globalData->uploader->uploadCPUToTex(imageAECPU);

And then, further down the call chain, we arrive at GLCPUToTexCopier::_beginProcessing, inside which we fail to bind the buffer:

glBindBuffer(inPBOBuffer->desc.target, inPBOBuffer->name);
if (glGetError() != GL_NO_ERROR) {
  assert("Cannot Bind to Buffer" && false);
}
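
For what it's worth, glGetError() pops only one error flag per call, and flags accumulate from whatever GL call raised them earlier, so the check above can blame glBindBuffer for a stale error. A slightly more talkative version of the same check (the VVGL names are from the code above; the draining and logging are my sketch):

#include <cstdio>

// Drain any stale error flags first, so whatever we read next really
// belongs to glBindBuffer and not to an earlier call.
while (glGetError() != GL_NO_ERROR) {}

glBindBuffer(inPBOBuffer->desc.target, inPBOBuffer->name);

GLenum glErr;
while ((glErr = glGetError()) != GL_NO_ERROR) {
  // On an MFR worker thread, an error here (or a crash inside the call
  // itself) would be consistent with no GL context being current on
  // that thread.
  fprintf(stderr, "glBindBuffer failed with GL error 0x%04X\n", glErr);
}
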

All I found about this is a similar issue. But I'm not particularly knowledgeable about working with glew, and maybe @baku89 has an idea why, for instance, our buffer gets cleared through VVGL::GetGlobalBufferPool()->housekeeping(); in the renderISFToCPUBuffer function and not, say, through purge()? Or why does the cleanup happen here and not at the end of SmartRender()? Spoiler: I tried changing and moving them, it doesn't make any difference.