Closed by Calinou 2 years ago
I cannot reproduce the issue with the latest version (v1.4.1) of Open Image Denoise. In your case, the stride must be 16 bytes (4 channels * 4 bytes per channel). There's no reason to use different strides for the input and the output since these are the same buffer. What happens if you use a stride of 16? Which version of Open Image Denoise are you using?
With denoising enabled and both strides set to 16, this is the lightmap I get:
When denoising is disabled (which results in the expected appearance for the color and alpha channel), this is the lightmap I get:
The shadowmask is stored in the alpha channel, with an alpha of 0% being "fully occluded" and 100% being "not occluded at all".
Which version of Open Image Denoise are you using?
I'm using 1.1.0 because it's the last version that can be compiled without ISPC. (Apologies, but we can't use ISPC for various reasons, such as the added complexity.)
It looks like it's not just the alpha channel that's wrong but the color as well. Is this indeed the case? I could not reproduce the issue with v1.1.0 either: the color channel is correctly denoised and the alpha channel is preserved. Are you sure that your format is indeed RGBA (and not ARGB) and that the provided parameters are correct (e.g., is the stride set correctly)? It seems that either the actual format is different from what is specified to OIDN, or there is a bug in the code that calls OIDN. In any case, I would need a stand-alone reproducer to look into this further.
How do you disable the denoiser to get the correct second image? Do you just skip calling your `oidn_denoise` function, or are there other differences between the two code paths? Do you also get a correct image if you just comment out `oidnExecuteFilter` but otherwise leave the denoiser enabled? I'm wondering whether enabling the denoiser triggers some additional code outside OIDN, which provides incorrect data to OIDN.
For context, I'm working on a Godot pull request that adds shadowmasking to the CPU lightmapper, which uses OpenImageDenoise. The shadowmask is stored in the lightmap's alpha channel to avoid the need for a separate texture (which would add a whole lot of complexity). However, if I enable the denoiser, this alpha channel is lost when the image is denoised.
I've tried the following to allow preserving the alpha channel of the input image:
I'm not sure I understand what size to pass as a stride. My image data is always `Image::FORMAT_RGBAF` (floating-point RGBA with 32 bpc precision). I've tried different values such as `4`, `8`, `12` and `16`, but they all resulted in an error message (`LightmapDenoiser: pixel stride smaller than pixel size`) or an image that looked corrupted. I've also tried specifying different strides for the `"color"` and `"output"` images, to no avail.

Would it be possible to add a code sample to the README that describes how to preserve an image's alpha channel, assuming the image uses RGBAF format (not ARGBF)? Thanks in advance :slightly_smiling_face: