db0 / fedi-safety

A script that goes through Lemmy images in storage and tries to prevent illegal or unethical content
GNU Affero General Public License v3.0

GPU requirements? #1

Closed poVoq closed 10 months ago

poVoq commented 10 months ago

Could you be a bit clearer about the GPU requirements of this?

Is CUDA needed, or would OpenCL suffice?

Can it work on ROCm with AMD GPUs?

Would an Intel integrated GPU perhaps be sufficient?

How much VRAM would typically be required?

Thanks!

tazlin commented 10 months ago

For Nvidia, the card has to be supported by PyTorch (https://pytorch.org/), so I would expect only CUDA-enabled cards to work. ROCm should also work, but is untested.

An integrated GPU is not supported. This is a limitation of PyTorch.

As for VRAM requirements, the model only takes ~2-3 GB when loaded, so I would expect even a 4 GB (CUDA-enabled) card to suffice, perhaps somewhat slowly, but your mileage may vary. A CPU would also work as a last resort, but would take orders of magnitude longer (minutes vs. seconds).
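
For anyone wanting to check their hardware in advance, a minimal sketch along these lines (generic PyTorch, not part of fedi-safety; a single GPU at index 0 is assumed) will report which device would be used and how much VRAM it has:

```python
# Hypothetical device check, assuming a standard PyTorch install; not part of fedi-safety.
import torch

if torch.cuda.is_available():
    # ROCm builds of PyTorch also report through torch.cuda, so this covers AMD as well.
    props = torch.cuda.get_device_properties(0)  # assumes a single GPU at index 0
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.1f} GB")
    device = torch.device("cuda")
else:
    # CPU fallback works, but is orders of magnitude slower (minutes vs. seconds per image).
    print("No CUDA/ROCm device visible; falling back to CPU.")
    device = torch.device("cpu")
```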

poVoq commented 10 months ago

Thanks... I guess if this works with something other than object storage in the future, I'll give it a try on an Nvidia 970M with 3 GB of VRAM.

db0 commented 10 months ago

I plan to add support for local storage, but I don't have an easy way to test it, as I only use object storage.

jeena commented 10 months ago

I think you should add to the README that it does not work on an integrated GPU. My guess is that most people run some kind of laptop with an integrated GPU, so they will install everything and then hit `UserWarning: User provided device_type of 'cuda', but CUDA is not available. Disabling`, like I did.
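
For readers who want to avoid installing everything only to hit that warning, a quick preflight check (assuming PyTorch is already installed) is to confirm that a usable GPU is actually visible:

```python
# Preflight check: prints False on machines with only an integrated GPU,
# which is the situation that triggers the "CUDA is not available" warning above.
import torch
print(torch.cuda.is_available())
```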