Open valhassan opened 3 months ago
Multi-GPU issue
The current blocker for multi-GPU support is that the model is never mapped to any device other than cuda:0; whenever CUDA is available, it always ends up on cuda:0. This mapping may be hard-coded in the model itself, so the model's device handling needs to change to support multi-GPU.
How to check:
1. Download https://github.com/MarjanAsgari/geo-inference-dask/blob/local_run/geo_inference/test_model_device.py to your device.
2. Get an HPC node with multiple GPUs available.
3. Run test_model_device.py.
The script tries to map the model to cuda:1, but what you should see is that the model still ends up on cuda:0.
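The check above can be sketched as follows. This is a minimal stand-in, not the actual test_model_device.py: the `nn.Linear` model is a placeholder for the real model, and the point is only how to verify which device the model's parameters actually land on after `.to(...)`.

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the real model; only the device check matters here.
model = nn.Linear(4, 2)

# Pick cuda:1 when a second GPU exists, otherwise fall back to CPU so the
# sketch also runs on a machine without GPUs.
target = "cuda:1" if torch.cuda.device_count() > 1 else "cpu"
model = model.to(target)

# If the model hard-codes cuda:0 internally, this check would show cuda:0
# on a multi-GPU node even after .to("cuda:1").
actual = next(model.parameters()).device
print(actual)
```

On a multi-GPU node, `actual` should print `cuda:1`; the bug described above is that it reports `cuda:0` instead.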
@valhassan @jfbourgon @mpelchat04
I created a related issue on the Model repository.
Currently, geo-inference only supports a single GPU. I want to add multi-GPU support to increase inference speed.
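One simple way to use multiple GPUs for inference is to distribute work chunks across devices. The sketch below is hypothetical (the helper name and device list are assumptions, not geo-inference's API) and shows only the round-robin scheduling idea, independent of any deep-learning framework:

```python
from itertools import cycle

def assign_devices(n_chunks, devices):
    """Round-robin assignment of inference chunks to devices.

    Hypothetical helper: each chunk (e.g. an image tile) gets the next
    device in the list, wrapping around, so all GPUs stay busy.
    """
    dev = cycle(devices)
    return [next(dev) for _ in range(n_chunks)]

print(assign_devices(5, ["cuda:0", "cuda:1"]))
# → ['cuda:0', 'cuda:1', 'cuda:0', 'cuda:1', 'cuda:0']
```

A separate model replica would then be loaded onto each device and fed its assigned chunks, rather than mapping everything to cuda:0.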