Open jombo23 opened 1 year ago
@jombo23 I'm kinda amazed nobody at AMD bothered to answer this, because it's an easy one and an obvious bug. I'm sure you already know, but you're stuck rendering at fp32 unless they fixed this without replying.
Because of the binary representation of 16-bit IEEE floating point, it can't represent all integers. At 1024 it loses the ability to represent x.5 values but can still represent every integer; that may already start to affect screen-space antialiasing. At 2048 it can only represent every other integer, so you get the 2-pixel-wide smeared artifacts you're seeing.
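You can see this directly with NumPy's float16 type (just an illustration, nothing to do with the renderer's actual code): fp16 has an 11-bit significand (10 stored bits plus one implicit), so halves disappear above 1024 and odd integers disappear above 2048.

```python
import numpy as np

# fp16 significand: 10 stored bits + 1 implicit = 11 bits,
# so exact integers run out at 2048.
print(float(np.float16(1024.5)))  # 1024.0 -- halves are already gone above 1024
print(float(np.float16(2047.0)))  # 2047.0 -- still exact below 2048
print(float(np.float16(2049.0)))  # 2048.0 -- odd integers round away above 2048
print(float(np.spacing(np.float16(2048))))  # 2.0 -- gap between adjacent fp16 values here
```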
There's no way around this on your end. I'd consider it a bug in the renderer for screen-space work like this, since a scaling factor can be used to map the screen space into a more usable range of fp16, as is usually done with such mappings. Denoising shouldn't rely on anything specific to screen location beyond knowing where to look for surrounding pixels (in three dimensions, if it's good). If it's doing this, it's probably just using the location as an index register. Somebody should tell them they could use an integer for that if their algorithm relies on discrete values, and they wouldn't lose any speed.

I once heard tell that if you wanted the double speed a half-float type gets in your AMD GPU program, and knew you might run up past the fairly low range in which it has discrete values, you could convert it to a single before then and keep going, but that was probably just a mean bunch o' tall tales meant to scare the weak. Use one of those fp8 types to index the screen and textures by pixel, I just dare you. :-) In fact, leapfrog NVIDIA and make an fp4 type!
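On the integer-index point: a plain 16-bit integer represents every pixel coordinate up to 32767 exactly, right where fp16 has already started skipping columns. A quick NumPy sketch (illustrative only, not the denoiser's code; 2051 is just a column past the 2048 breakdown point):

```python
import numpy as np

x = 2051                        # a pixel column past the fp16 breakdown point
print(float(np.float16(x)))     # 2052.0 -- fp16 lands on the wrong pixel
print(int(np.int16(x)))         # 2051   -- int16 is exact for every pixel up to 32767
```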
If you rendered it out past 4096, the smear would become 4 pixels wide; past 8192, 8 pixels, and so on. This is a common trait of all floating-point formats at some point in their usable range. Most programs remap things internally to avoid it. ProRender seems to do this elsewhere, since ultra-high-resolution textures apparently aren't broken.
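The doubling smear width falls straight out of the format: the gap between adjacent fp16 values doubles at every power of two, which NumPy's `spacing` shows directly (again just an illustration):

```python
import numpy as np

# The gap between adjacent fp16 values doubles at each power of two,
# matching the 1 -> 2 -> 4 -> 8 pixel smear widths described above.
for edge in (1024, 2048, 4096, 8192):
    print(edge, float(np.spacing(np.float16(edge))))
# 1024 1.0
# 2048 2.0
# 4096 4.0
# 8192 8.0
```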
Blend file attached: shadertestsproender.zip
When "Use 16-bit Compute" is enabled in Scene -> View Layer -> RPR Denoiser (Machine Learning), the denoiser starts to produce artifacts at around an x coordinate of 2050. (The scene is 2304x512 px.)
In the example I've given, frame 54 is rendered with 32 samples.
Screenshots attached; look at the right-hand side of the image.
"Use 16-bit compute" off:
"Use 16-bit compute" on:
Sorry for any formatting errors, please let me know if I missed something or if you have any questions.