Open changg10 opened 1 month ago
I often get this warning; how can I fix it?

zarr distributed checkpoint backend is deprecated. Please switch to PyTorch Distributed format (torch_dist).
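This warning typically comes from Megatron-LM/NeMo-style distributed checkpointing (an assumption based on the message text). If so, the fix is usually to select the `torch_dist` checkpoint format in the launch arguments. A hedged sketch of a launch fragment; the exact flag name varies by version (older releases use `--dist-ckpt-format`, newer ones `--ckpt-format`), so verify against your version's `--help`:

```shell
# Hypothetical Megatron-LM launch fragment -- the script name and
# other arguments are placeholders; only the checkpoint-format flag
# is the point of this example.
torchrun --nproc_per_node=2 pretrain_gpt.py \
    ... \
    --ckpt-format torch_dist  # replaces the deprecated zarr backend
```

Existing zarr checkpoints generally need to be converted or retrained from; the two formats are not interchangeable on disk.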
This issue is stale because it has been open for 30 days with no activity. Remove stale label or comment or this will be closed in 7 days.
I am still getting errors.
I am using two NVIDIA RTX 5880 48G GPUs, each with 32GB of memory. When I use only one GPU, its memory is fully utilized. However, when I use both GPUs, only one GPU's memory is fully utilized while the other appears underutilized. Why is this happening? Also, why does data type conversion consume so much GPU memory?
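On the data-type-conversion question: an out-of-place cast such as `tensor.float()` allocates a new buffer, so for a moment both the source and destination copies are live, and peak memory is their sum. A back-of-the-envelope sketch in pure Python (the 7B parameter count is a hypothetical example, not taken from this issue):

```python
def cast_peak_bytes(num_params: int, src_bytes: int, dst_bytes: int) -> int:
    """Peak bytes while casting a tensor out-of-place.

    During the cast both the source and destination buffers exist
    simultaneously, so the peak is the sum of the two sizes.
    """
    return num_params * (src_bytes + dst_bytes)


params = 7_000_000_000          # hypothetical 7B-parameter model
fp16, fp32 = 2, 4               # bytes per element for each dtype

peak = cast_peak_bytes(params, fp16, fp32)
print(f"{peak / 1024**3:.1f} GiB peak while casting fp16 -> fp32")
# Roughly 39 GiB transient peak for a cast whose result is "only" 26 GiB.
```

This is why an fp16-to-fp32 conversion can spike memory well beyond the size of the resulting tensor; casting in smaller chunks or in place (where the framework allows it) reduces the spike.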