Closed · regcs closed this issue 3 years ago
Further debugging: The bug can be resolved by using numpy arrays with 4 color channels instead of 3 for the offscreen rendering. Do the procedures used to copy pixel data from the OpenGL texture to the numpy array expect an RGBA buffer in Blender? If so, this is a Blender limitation.
At the moment, I don't want to use RGBA numpy arrays because that would slow down the live view again, but it seems to be the only way for now. Will do some research later.
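For reference, a minimal sketch of the 4-channel read-back that avoids the crash (to be run inside Blender; the exact `gpu`/`bgl` calls vary slightly between Blender versions, and the sizes below are just the single-view dimensions from the failing preset, not the add-on's actual code):

```python
import bgl
import gpu
import numpy as np

# Single view of a 45-view 4096x4096 quilt (sizes taken from this report).
width, height = 819, 455
offscreen = gpu.types.GPUOffScreen(width, height)

with offscreen.bind():
    # ... draw the view into the offscreen framebuffer here ...

    # Read back 4 channels (RGBA): each row is width * 4 bytes, which is
    # always a multiple of 4, so the default GL_PACK_ALIGNMENT is satisfied.
    buffer = bgl.Buffer(bgl.GL_BYTE, width * height * 4)
    bgl.glReadPixels(0, 0, width, height, bgl.GL_RGBA, bgl.GL_UNSIGNED_BYTE, buffer)

# Slow sequence copy into numpy, for illustration only; the int16 detour
# guards against bgl byte buffers exposing signed values.
pixels = np.asarray(buffer.to_list(), dtype=np.int16).astype(np.uint8)
pixels = pixels.reshape(height, width, 4)
```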
I still believe this is a Blender bug or limitation. I have filed a bug report with the Blender developers: https://developer.blender.org/T91828
The lightfield viewport / "live view" now uses RGBA images again, which fixes the bug. However, we now use OpenCV to convert the quilt from RGBA or BGRA to RGB, which incurs only a minor performance loss.
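A sketch of that conversion step, assuming the quilt arrives as an H x W x 4 uint8 numpy array (variable names are illustrative, not the add-on's actual code):

```python
import cv2
import numpy as np

# Placeholder for the quilt read back from the offscreen buffer (RGBA byte order assumed).
quilt_rgba = np.zeros((4096, 4096, 4), dtype=np.uint8)

# Drop the alpha channel; use cv2.COLOR_BGRA2RGB instead if the buffer is BGRA.
quilt_rgb = cv2.cvtColor(quilt_rgba, cv2.COLOR_RGBA2RGB)
```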
For the sake of completeness and later updates:
The actual issue was that, by default, OpenGL expects the byte length of each pixel row of an image to be divisible by 4. For some quilts (e.g., 45 views at 4096x4096), the single views (819x455) did not meet this requirement when read back with 3 channels, and Blender therefore crashed. The value of GL_PACK_ALIGNMENT is now set to 1 by default in Blender with this commit by Germano Cavalcante:
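To make the alignment argument concrete: with the default GL_PACK_ALIGNMENT of 4, OpenGL pads every row it writes to client memory up to the next multiple of 4 bytes, so a tightly packed 3-channel destination buffer gets overrun and memory is corrupted. The numbers below are from this report; the glPixelStorei call is only a possible add-on-side workaround sketch for Blender builds that predate the fix:

```python
import bgl

# One view of a 45-view quilt at 4096x4096 (5x9 grid) is 819x455 pixels.
view_width, channels = 819, 3
row_bytes = view_width * channels    # 2457 bytes per pixel row
print(row_bytes % 4)                 # 1 -> not divisible by 4

# Relax the pack alignment before glReadPixels on Blender builds that
# still use the OpenGL default of 4.
bgl.glPixelStorei(bgl.GL_PACK_ALIGNMENT, 1)
```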
For specific combinations of display selections and quilt presets, Blender crashes as soon as the lightfield window is opened.
So far, I was able to trace the error back to the point where the pixel data is read from the OpenGL framebuffer into the numpy array. However, it is still unclear why this only happens sometimes. It could be related to some internal memory management issue.
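To illustrate why only specific display/quilt combinations are affected, here is a small check over two illustrative presets (the second one is hypothetical): with a 3-channel read-back, only presets whose view width times 3 bytes is not a multiple of 4 trigger the crash.

```python
# Illustrative quilt presets: view size derived from quilt resolution and grid (columns x rows).
presets = {
    "45 views, 4096x4096 (5x9)": (4096 // 5, 4096 // 9),   # from this report
    "32 views, 2048x2048 (4x8)": (2048 // 4, 2048 // 8),   # hypothetical example
}

for name, (view_w, view_h) in presets.items():
    row_bytes = view_w * 3                 # bytes per row of a 3-channel view
    aligned = (row_bytes % 4 == 0)
    print(f"{name}: view {view_w}x{view_h}, {row_bytes} bytes/row, aligned={aligned}")
```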