Closed: traversaro closed this issue 2 months ago.
Failing tests (log file: `~/.gz/rendering/ogre2.log`):

- `INTEGRATION_gpu_rays_ogre2_gl3plus`
- `INTEGRATION_heightmap_ogre2_gl3plus`
- `INTEGRATION_lidar_visual_ogre2_gl3plus`
- `INTEGRATION_projector_ogre2_gl3plus`
- `UNIT_Utils_TEST_ogre2_gl3plus`
The heightmap sample doesn't say why it's failing, but I have two very strong guesses:

1. The terrain shadow shader uses `#version 430`, and based on the PBS shaders' output from WSLg, GL 4.3 is not supported. The Ogre.log also says the OpenGL context version is 4.1.
2. It looks like WSLg may not support compute shaders at all: the Ogre.log doesn't show the extension `GL_ARB_compute_shader`, and this extension was made mandatory in core GL 4.3. Which suggests compute shaders are not supported by WSLg's current driver.

The compute shader in heightmap is used for terrain shadows. In theory it could be possible to skip this (at the cost of not having terrain shadows), but supporting such a path has its development cost.
Interestingly, it seems that `GL_ARB_compute_shader` should be supported by d3d12: https://gitlab.freedesktop.org/mesa/mesa/-/blob/22.2/docs/features.txt?ref_type=heads#L174 , perhaps there is something strange on my testing setup.
Quickly skimming through ARB_compute_shader shows "OpenGL 4.2 is required."
Thus it makes sense that the extension won't be exposed.
However that same document says OpenGL 4.2 is also done for d3d12.
Try seeing if glxinfo reports GL 4.1 or 4.2
If glxinfo reports 4.1, there's something wrong in your setup.
If it reports 4.2, then try launching gazebo with env variable LD_DEBUG=libs
to double check the right GL libraries are being loaded (compare it against what glxinfo loads).
If both glxinfo and gazebo load the same GL libraries but one creates a 4.2 context and the other 4.1, then that's likely a weird Ogre bug.
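The version check above can be sketched as a one-liner; the sample string here stands in for real `glxinfo` output (on a live system, pipe `glxinfo` directly):

```shell
# Sample of glxinfo output (illustrative); on a real system use:
#   glxinfo | sed -n 's/^Max core profile version: //p'
sample='Max core profile version: 4.1
Max compat profile version: 4.1'

# Print only the max core profile version.
printf '%s\n' "$sample" | sed -n 's/^Max core profile version: //p'
```

If this prints 4.1 on your setup, the driver is capping the context below what compute shaders need.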
Ok, some more debugging.
`INTEGRATION_projector_ogre2_gl3plus`: this test works fine after updating mesa (and hence d3d12) from 22.2.5 to 23.1.0.
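A quick way to confirm which Mesa version is actually in use is to parse the version string; a sketch where the sample line mirrors the glxinfo output shown below:

```shell
# Sample glxinfo line (illustrative); on a live system run:
#   glxinfo | grep 'OpenGL core profile version string'
line='OpenGL core profile version string: 4.2 (Core Profile) Mesa 22.2.5'

# Pull out just the Mesa version.
printf '%s\n' "$line" | grep -oE 'Mesa [0-9.]+'
```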
`INTEGRATION_heightmap_ogre2_gl3plus`: for this one, I understood why I got OpenGL 4.1. Basically I have two GPUs on my system, an Intel one and an NVIDIA one. By default d3d12 picks the Intel one, which returns OpenGL 4.1 (probably due to an underlying limitation of the GPU or of the driver). By selecting NVIDIA, I am able to get OpenGL 4.2:
```
traversaro@IITICUBLAP257:~$ MESA_D3D12_DEFAULT_ADAPTER_NAME=Intel glxinfo | grep version
server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
Max core profile version: 4.1
Max compat profile version: 4.1
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.0
OpenGL core profile version string: 4.1 (Core Profile) Mesa 22.2.5
OpenGL core profile shading language version string: 4.10
OpenGL version string: 4.1 (Compatibility Profile) Mesa 22.2.5
OpenGL shading language version string: 4.10
OpenGL ES profile version string: OpenGL ES 3.0 Mesa 22.2.5
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.00
traversaro@IITICUBLAP257:~$ MESA_D3D12_DEFAULT_ADAPTER_NAME=Intel glxinfo | grep GL_ARB_compute
traversaro@IITICUBLAP257:~$ MESA_D3D12_DEFAULT_ADAPTER_NAME=NVIDIA glxinfo | grep version
server glx version string: 1.4
client glx version string: 1.4
GLX version: 1.4
Max core profile version: 4.2
Max compat profile version: 4.2
Max GLES1 profile version: 1.1
Max GLES[23] profile version: 3.1
OpenGL core profile version string: 4.2 (Core Profile) Mesa 22.2.5
OpenGL core profile shading language version string: 4.20
OpenGL version string: 4.2 (Compatibility Profile) Mesa 22.2.5
OpenGL shading language version string: 4.20
OpenGL ES profile version string: OpenGL ES 3.1 Mesa 22.2.5
OpenGL ES profile shading language version string: OpenGL ES GLSL ES 3.10
traversaro@IITICUBLAP257:~$ MESA_D3D12_DEFAULT_ADAPTER_NAME=NVIDIA glxinfo | grep GL_ARB_compute
GL_ARB_compute_shader, GL_ARB_compute_variable_group_size,
GL_ARB_compressed_texture_pixel_storage, GL_ARB_compute_shader,
GL_ARB_compute_variable_group_size, GL_ARB_conditional_render_inverted,
```
By switching to NVIDIA, I still get the error:

```
19: C++ exception with description "OGRE EXCEPTION(3:RenderingAPIException): Compute Program 0TerraShadowGenerator failed to compile. See compile log above for details. in GLSLShader::compile at /home/traversaro/gz-ws/srctst/ogre-next/RenderSystems/GL3Plus/src/GLSL/OgreGLSLShader.cpp (line 369)" thrown in the test body.
```

during the tests, but I get a clearer error in ogre2.log:
```
19:34:44: GLSL compile log: 0TerraShadowGenerator
0:2(10): error: GLSL 4.30 is not supported. Supported versions are: 1.10, 1.20, 1.30, 1.40, 1.50, 3.30, 4.00, 4.10, 4.20, 1.00 ES, 3.00 ES, and 3.10 ES
0:0(0): error: Compute shaders require GLSL 4.30 or GLSL ES 3.10
19:34:44: OGRE EXCEPTION(3:RenderingAPIException): Compute Program 0TerraShadowGenerator failed to compile. See compile log above for details. in GLSLShader::compile at /home/traversaro/gz-ws/srctst/ogre-next/RenderSystems/GL3Plus/src/GLSL/OgreGLSLShader.cpp (line 369)
```
At this point, I need to understand a bit more whether my drivers on the Windows side are up to date and whether there is any specific limitation on my NVIDIA GPU (NVIDIA GeForce GTX 1650 Ti). I will update the issue with more details.
Ok that makes more sense.
The `#version 430` may "just work" if it's changed to `#version 420`. But if it doesn't, let's see what's the next error to fix :)

Honestly I think there was no driver in the world that implemented Compute Shaders without reporting GL 4.3; if there was, then it was only live for a few months a decade ago.
Ok, I was able to get all tests to pass with the following steps.
- Set `MESA_D3D12_DEFAULT_ADAPTER_NAME=NVIDIA` to use my NVIDIA card (to get `GL_ARB_compute_shader` support).
- Apply the following patch:

```diff
diff --git a/ogre2/src/media/2.0/scripts/materials/Common/GLSL/GaussianBlurBase_cs.glsl b/ogre2/src/media/2.0/scripts/materials/Common/GLSL/GaussianBlurBase_cs.glsl
index 9e013df6..5dadc8fe 100644
--- a/ogre2/src/media/2.0/scripts/materials/Common/GLSL/GaussianBlurBase_cs.glsl
+++ b/ogre2/src/media/2.0/scripts/materials/Common/GLSL/GaussianBlurBase_cs.glsl
@@ -1,5 +1,7 @@
 @property( syntax != glslvk )
-	#version 430
+	#version 420
+	#extension GL_ARB_arrays_of_arrays: enable
+	#extension GL_ARB_compute_shader: enable
 @else
 	#version 450
 @end
diff --git a/ogre2/src/media/2.0/scripts/materials/Terra/GLSL/TerraShadowGenerator.glsl b/ogre2/src/media/2.0/scripts/materials/Terra/GLSL/TerraShadowGenerator.glsl
index e519d273..4b0d428d 100644
--- a/ogre2/src/media/2.0/scripts/materials/Terra/GLSL/TerraShadowGenerator.glsl
+++ b/ogre2/src/media/2.0/scripts/materials/Terra/GLSL/TerraShadowGenerator.glsl
@@ -1,5 +1,6 @@
 @property( syntax != glslvk )
-	#version 430
+	#version 420
+	#extension GL_ARB_compute_shader: enable
 #define ogre_B0 binding = 0
 #define ogre_B1 binding = 1
 @else
```
By doing that, all tests pass fine. I still have a segfault on program exit once the tests pass, but this seems unrelated (see https://github.com/microsoft/wslg/issues/715).
Thanks for the detective work!
Since we try to aim for 430 for simplicity (probably 420 + extensions is the exact same thing, but I'm trying to play safe), I incorporated your fixes so they apply only when 4.30 isn't supported (which is basically WSLg drivers).
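The kind of conditional fallback described above can be sketched as follows; the file path, detection method, and shader contents are all hypothetical stand-ins (the real change lives in gz-rendering's material scripts):

```shell
# Hypothetical shader file for the demo.
shader=/tmp/TerraShadowGenerator_demo.glsl
printf '#version 430\nvoid main() {}\n' > "$shader"

# Highest core profile the driver exposes; hardcoded here, but on a real
# system it could come from: glxinfo | sed -n 's/^Max core profile version: //p'
glver="4.2"

# If the driver is below GL 4.3, downgrade the version directive and enable
# the compute-shader extension explicitly (the WSLg workaround from this thread).
if [ "$(printf '%s\n' "$glver" 4.3 | sort -V | head -n1)" != "4.3" ]; then
  sed -i 's/#version 430/#version 420\n#extension GL_ARB_compute_shader: enable/' "$shader"
fi
head -n 2 "$shader"
```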
Cool, thanks! Yes that seems the right thing to do.
Another instance of `#version 430` is https://github.com/gazebosim/gz-rendering/blob/gz-rendering7_7.4.0/ogre2/src/media/2.0/scripts/materials/Common/GLSL/GaussianBlurLogFilterBase_cs.glsl, but I guess it is unused in tests.
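Remaining instances like this can be found mechanically; a sketch on a throwaway tree with made-up paths (the real check would be `grep -rl '#version 430'` from a gz-rendering checkout):

```shell
# Build a tiny stand-in tree (illustrative paths, not the real repo layout).
mkdir -p /tmp/demo_media/Common/GLSL
printf '#version 430\nvoid main() {}\n' > /tmp/demo_media/Common/GLSL/Blur_cs.glsl
printf '#version 330\nvoid main() {}\n' > /tmp/demo_media/Common/GLSL/Quad_vs.glsl

# List every shader still pinned to GLSL 4.30.
grep -rl '#version 430' /tmp/demo_media
```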
I incorporated the fixes for this issue (except for `INTEGRATION_projector_ogre2_gl3plus`, which can be fixed by updating mesa) in https://github.com/gazebosim/gz-rendering/pull/851 .
Thanks for investigating the issue and for the fixes!
Environment
Description
After fixing the basic problem in https://github.com/gazebosim/gz-sim/issues/920, on a WSLg installation using the d3d12 driver, some tests of the gz-rendering test suite are failing, while the test suite works fine if one sets `export LIBGL_ALWAYS_SOFTWARE=true`.

Steps to reproduce
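For reference, the software-rendering workaround mentioned above is just an environment variable; a minimal sketch (the test invocation in the comment is a placeholder):

```shell
# Force Mesa's software rasterizer instead of the d3d12 GPU path.
export LIBGL_ALWAYS_SOFTWARE=true
# Any GL client started from this shell now renders in software, e.g.:
#   ctest -R INTEGRATION_heightmap_ogre2_gl3plus   # placeholder invocation
echo "$LIBGL_ALWAYS_SOFTWARE"
```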
Run the testsuite on WSLg with d3d12 after applying the following patches:
Output
More info will be provided in the next comments.
The `INTEGRATION_gpu_rays_ogre2_gl3plus`, `INTEGRATION_heightmap_ogre2_gl3plus` and `UNIT_Utils_TEST_ogre2_gl3plus` failures are of the kind `Compute Program 0TerraShadowGenerator failed to compile`, but as no shader program was placed in the log directory, I modified OgreNext directly to print the shader program. For the `INTEGRATION_lidar_visual_ogre2_gl3plus` and `INTEGRATION_projector_ogre2_gl3plus` tests, the failure seems to be related to something else.