-
### Tested versions
Godot v4.4.dev (c6c464cf9)
### System information
Godot v4.4.dev (c6c464cf9) - Windows 10.0.22631 - Multi-window, 1 monitor - Vulkan (Forward+) - dedicated NVIDIA GeForce RTX 40…
-
optimizer: SGD with parameter groups 57 weight, 60 weight (no decay), 60 bias
DP not recommended, instead use torch.distributed.run for best DDP Multi-GPU results.
See Multi-GPU Tutorial at https://…
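The log above recommends launching with `torch.distributed.run` instead of DataParallel. A minimal DDP sketch of that launch pattern (the training loop is elided; the model is a stand-in, and the `gloo` backend is chosen only so the sketch also runs on CPU-only hosts — use `nccl` on multi-GPU machines):

```python
# Launch with: python -m torch.distributed.run --nproc_per_node 8 train.py
# torch.distributed.run sets RANK, WORLD_SIZE and LOCAL_RANK for each worker.
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def main():
    # "nccl" on GPU machines; "gloo" also works on CPU-only hosts.
    dist.init_process_group(backend="gloo")
    local_rank = int(os.environ["LOCAL_RANK"])
    model = torch.nn.Linear(10, 2)  # stand-in for the real network
    if torch.cuda.is_available():
        model = model.to(local_rank)
    ddp_model = DDP(model)
    # ... per-rank training loop goes here ...
    dist.destroy_process_group()

if __name__ == "__main__" and "RANK" in os.environ:
    main()
```

Each worker process runs the same script; DDP synchronizes gradients across ranks during `backward()`, which is why it scales better than single-process DataParallel.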
-
I use 8 GPUs for inference, but it is slower than with 1 GPU.
I changed 'gpu_use = 0' to 'gpu_use = 0,1,2,3,4,5,6,7' in retinanet_inference_example.py.
Is that right?
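For context: a comma-separated device list like this usually enables `nn.DataParallel`, which splits each input batch across the listed GPUs, so with small batches the scatter/gather overhead easily makes 8 GPUs slower than 1. A generic sketch of the pattern (the stand-in model and tensor sizes are made up; retinanet_inference_example.py's internals are not reproduced here):

```python
import torch
import torch.nn as nn

# Stand-in model; in the real script this would be the loaded RetinaNet.
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU())
model.eval()

if torch.cuda.device_count() > 1:
    # DataParallel splits each batch across the GPUs: a batch of 8 gives
    # each of 8 GPUs a single image, so transfer overhead dominates.
    model = nn.DataParallel(model, device_ids=list(range(torch.cuda.device_count())))
    model.to("cuda:0")

with torch.no_grad():
    x = torch.randn(16, 3, 32, 32)
    if torch.cuda.is_available():
        x = x.to("cuda:0")
    out = model(x)
```

Multi-GPU inference only pays off when the per-GPU batch is large enough to amortize the scatter/gather cost; otherwise a single GPU is faster.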
-
Was compiling the code and found that SpatialLogSoftMax has been added by torch. This configuration doesn't work now because the class names clash. I looked at the code and got it to run, but just a …
-
Hello, I've been working on parallel training using the new multi-GPU features. I noticed that for certain code it seems to run fine the first time, but in any subsequent runs I get an "invalid resour…
-
I'm trying to train DOPE with the script at train2.
But my computer has only one GPU (an RTX 3050).
Must I use multiple GPUs?
Is there any way to train DOPE with one GPU?
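Single-GPU training is usually possible without touching the multi-GPU code path: restrict device visibility and place everything on `cuda:0` (or CPU). A hedged sketch — the environment-variable trick is generic PyTorch, not anything specific to the DOPE train2 script:

```python
import os

# Expose only the first GPU to this process; this must be set before CUDA
# is initialized (ideally before importing torch).
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import torch

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
model = torch.nn.Linear(10, 2).to(device)  # stand-in for the real network
batch = torch.randn(4, 10, device=device)
out = model(batch)
```

With only one visible device, any `DataParallel` wrapper in the training script degenerates to plain single-device execution.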
-
### System Info
### Environment Details:
- **`trl` version**: `0.11.1`
- **`transformers` version**: `4.45.1`
- **Python version**: `3.10.11`
- **Operating System**: `Linux 4…
-
As per the discussion in https://github.com/JuliaGPU/CuArrays.jl/issues/52, which I report here, the current way to use a specific GPU in a Flux script would be the following:
```julia
gpu_id = 0 ## set …
```
-
The code does not seem to support multi-GPU training. Although I found that it has some parts meant to support it, they do not seem to work. Can you fix it?
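When partial multi-GPU support is DDP-based, the piece most often missing is a per-rank data sampler; without one, every GPU trains on the full dataset. A generic sketch (the toy dataset and sizes are made up), with `num_replicas`/`rank` passed explicitly so it runs outside a distributed launcher:

```python
import torch
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler

dataset = TensorDataset(torch.arange(8, dtype=torch.float32).unsqueeze(1))

# Under torch.distributed.run these would come from
# dist.get_world_size() and dist.get_rank().
sampler = DistributedSampler(dataset, num_replicas=2, rank=0, shuffle=False)
loader = DataLoader(dataset, batch_size=2, sampler=sampler)

# Rank 0 of 2 sees every other sample: indices 0, 2, 4, 6.
seen = torch.cat([batch for (batch,) in loader]).squeeze(1).tolist()
```

Each rank iterates only its own shard, so the ranks together cover the dataset exactly once per epoch (call `sampler.set_epoch(epoch)` each epoch when shuffling).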