Open Transformer-man opened 3 months ago

Hello, when I use a single-card 4090, an error occurs: `RuntimeError: CUDA error: invalid device ordinal`.

Did you check the `gpu_mapping.yaml` file? Since you have only one GPU, you should change line 13 to `4090: [6]`.

Make sure you have enough GPU memory to run 6 clients on one card. For reference, each client needs at least 2 GB of GPU memory.
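For reference, a minimal sketch of what the relevant `gpu_mapping.yaml` entry could look like. The surrounding key name is an assumption for illustration; only the `4090: [6]` line comes from the suggestion in this thread. In this style of mapping file, each hostname maps to a list with one entry per GPU, and each entry is the number of client processes placed on that GPU index, so a single-GPU machine must use a one-element list.

```yaml
# Hypothetical gpu_mapping.yaml sketch (key names assumed, not from the repo).
# Each list entry is the number of client processes assigned to the GPU with
# that index; a one-element list keeps every client on GPU 0.
mapping_single_4090:
  4090: [6]   # GPU 0 runs all 6 clients (needs at least 6 * 2 GB = 12 GB)
```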
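The "invalid device ordinal" error means a process requested a CUDA device index that does not exist on the machine (e.g. `cuda:6` when only `cuda:0` is visible), which is exactly what a multi-GPU mapping causes on a single-GPU host. Here is a small sketch of a sanity check one could run before launching; the helper `validate_mapping` is hypothetical, not part of the repo:

```python
def validate_mapping(procs_per_gpu, device_count):
    """Check that a gpu_mapping-style list fits the visible GPUs.

    procs_per_gpu: list where entry i is the number of client processes
                   assigned to GPU index i (e.g. [6] = 6 clients on GPU 0).
    device_count:  number of CUDA devices actually visible on this host.
    """
    if len(procs_per_gpu) > device_count:
        raise ValueError(
            f"mapping references {len(procs_per_gpu)} GPUs but only "
            f"{device_count} are visible -- this is what triggers "
            f"'invalid device ordinal'"
        )
    return sum(procs_per_gpu)  # total number of client processes to launch

# A single 4090 with the suggested mapping: one list entry, 6 clients total.
print(validate_mapping([6], device_count=1))  # prints 6
```

In practice you would pass `torch.cuda.device_count()` as `device_count`.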