Closed: hldh214 closed this issue 11 months ago
Hello! Thank you very much for your interest in the library, and the excellent report!
You are actually right! In fact when I designed the high-level API, I only thought about single calls as I believe the for-loop use cases would be better achieved with the lower-level approach. Having said that, there is also no reason why the high-level API could not support for-loop cases as well.
I shall improve this, probably by allowing the users to inject an instantiated model object to the function, so that the model does not need to be re-created every time. Basically the same concept as you suggested.
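A minimal sketch of what such a model-injection API could look like. The `Model` class and the `model` parameter here are illustrative placeholders, not opennsfw2's actual code or signature:

```python
# Hedged sketch of a "pass in an instantiated model" API.
# Model and predict_images below are placeholders, not opennsfw2's real code.
class Model:
    """Stand-in for an expensive-to-create inference model."""
    def predict(self, path):
        return 0.0  # dummy score for illustration

def predict_images(paths, model=None):
    # Single-call use: create a model on demand.
    # For-loop use: the caller supplies one model and it is reused.
    if model is None:
        model = Model()
    return [model.predict(p) for p in paths]

# Build the model once outside the loop and inject it on each call,
# so no per-iteration construction (and no memory growth) occurs.
shared = Model()
scores = predict_images(["a.jpg", "b.jpg"], model=shared)
print(scores)  # prints [0.0, 0.0]
```

The design choice is plain dependency injection: the default argument preserves the simple one-shot API, while loop-heavy callers control the model's lifetime themselves.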
Thanks again!
Hello!
Please check out the latest release (0.12.0), in which an internal global model is used to avoid re-creating the model object every time, i.e., similar to what you have tried out. Kindly let me know if you find any issues.
Thanks!
Indeed it works, thanks again!
Hello, and first of all, thank you for this fantastic project; I truly appreciate it.
I've encountered what may be a minor issue when using `predict_images` in a for-loop, which results in an Out Of Memory (OOM) error on my 4 GB RAM machine. After investigating the code, I found the following line: https://github.com/bhky/opennsfw2/blob/9654c1e9238307d087066891c70d55bbb1df3951/opennsfw2/_inference.py#L31-L31
It appears that the model is created anew on every call to `predict_images`, which can leak memory, as my testing suggests. To address this, I moved the model creation outside the function and used that global model within it.
This change seems to have resolved the memory issue on my end.