Closed · TYang92677626 closed this issue 3 months ago
Same question here: is there a minimum VRAM requirement?
You may try using a single card. I benchmarked on a slice of an A100 80GB (about a 40192 MiB slice), and it took ~20.18 GB to run the exact same command as yours, i.e. torchrun --nproc_per_node=1 scripts/inference.py configs/opensora-v1-1/inference/sample.py --prompt "A beautiful sunset over the city" --num-frames 16 --image-size 240 426.
The benchmark script is at https://github.com/JThh/Open-Sora/blob/benchmem/scripts/inference.py.
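For reference, the peak figure above can be reproduced with PyTorch's built-in memory counters; a minimal sketch (the helper names are my own, not from the benchmark script):

```python
def bytes_to_gib(n_bytes: int) -> float:
    """Convert a raw byte count to GiB (the unit nvidia-smi reports)."""
    return n_bytes / 1024**3

def peak_allocated_gib(device: int = 0) -> float:
    """Peak GPU memory allocated by PyTorch on `device`, in GiB.

    Call torch.cuda.reset_peak_memory_stats(device) before the workload
    so the counter covers only that run.
    """
    import torch  # local import: bytes_to_gib works without torch installed
    return bytes_to_gib(torch.cuda.max_memory_allocated(device))

# The 40192 MiB A100 slice mentioned above is about 39.25 GiB:
print(f"{bytes_to_gib(40192 * 1024**2):.2f} GiB")  # prints "39.25 GiB"
```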
So, is there a way to significantly lower the memory requirement?
I'm not sure what you mean.
The above is the memory requirement for OpenSora 1.2. OpenSora 1.1 does not support sequence parallelism, so please try with OpenSora 1.2.
Using 4× RTX 4090 cards, I always run out of memory: CUDA_VISIBLE_DEVICES=0,1,2,3 torchrun --nproc_per_node 4 scripts/inference.py configs/opensora-v1-2/inference/sample.py --num-frames 4s --resolution 480p --aspect-ratio 9:16 --prompt "a beautiful waterfall"
I am running inference on 4 GPUs (Quadro RTX 6000, 24 GB each) and keep getting GPU out-of-memory errors. With both 4 GPUs and 1 GPU, the error says I am short by a few dozen MiB, so using multiple GPUs did not solve the problem. My batch_size is 1, and the other parameters are reduced accordingly, but it still reports insufficient GPU memory. Does Open-Sora require that a single GPU have enough space on its own? Has anyone encountered the same problem, and how did you solve it?
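One likely cause: with a plain multi-GPU launch (torchrun --nproc_per_node 4), each rank holds a full copy of the model and its activations, so GPU memory is not pooled across cards; only sequence parallelism (supported from OpenSora 1.2) actually shards the per-GPU footprint. A rough arithmetic sketch of the difference, using the ~20.18 GB figure from the benchmark above (the helper is hypothetical, and real sharding mainly reduces activation memory, not necessarily weights):

```python
def per_gpu_requirement_gib(model_gib: float, n_gpus: int, sharded: bool) -> float:
    """Per-card memory needed for inference.

    Without sequence/tensor parallelism, every rank holds the full working
    set, so adding GPUs does not lower the per-GPU requirement at all.
    """
    return model_gib / n_gpus if sharded else model_gib

# Plain multi-GPU launch: each 24 GiB card still needs the full ~20.18 GiB.
print(per_gpu_requirement_gib(20.18, 4, sharded=False))  # 20.18
# With sharding, the working set is split across the 4 ranks.
print(per_gpu_requirement_gib(20.18, 4, sharded=True))
```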
Below is the command to run inference: