Closed deranen closed 3 years ago
Hi there. The AI denoiser requires a minimum of 4 GB of memory.
Hi @bsavery. Thanks for the answer. Is there a way to control the maximum memory used? What is disappointing is that this used to work before.
@deranen as Brian explained, we limit the denoiser to 4 GB when running in FP32. We had to introduce that limit because the denoiser does indeed require 4 GB or more, depending on the input image size. Even if Windows 10 allows memory overcommitment, relying on it is bad practice, as the system will likely become unstable, especially when other processes are running. That is the main reason we introduced the limit. Moreover, even 4-5 year old mid-range cards all have more than 4 GB.
That said, we now also support FP16 inference, which reduces memory usage by 2x. To enable it, call rifImageFilterSetComputeType (https://github.com/GPUOpen-LibrariesAndSDKs/RadeonImageFilter/blob/master/include/RadeonImageFilters.h#L925) with the flag RIF_COMPUTE_TYPE_HALF.
However, AFAIR Tahiti (the chipset on your R9) doesn't support half precision. It is an old card not really designed for inference, but it's worth trying, as I haven't worked with Tahiti for a while.
We do have an environment variable to change the limit, and I can share it with you. But if you run into issues because of it, we are unlikely to provide support.
Finally, we have other ideas to reduce the memory usage of our models, but it will take some time to get them into production.
@BenjaminCoquelle Thank you for the answer. I will try your suggestions, but I also understand that you can't fully support older cards. This is not a bug, so I will close this issue.
Hello,
This used to work, but after updating to the latest SDK/binaries I get the following output:
I'm running Windows 10 with a Radeon, GeForce and integrated Intel GPU. The AMD GPU is a Radeon R9 280X.
Let me know if you need more info.