Closed · Peyton-Chen closed this issue 6 months ago
Hi @Peyton-Chen ,
SVD inference requires less than 13 GB of GPU memory. You can try this command:
python main.py --model_type svd
Thank you for your response! I've successfully resolved the issue. It turns out the root cause was my package versions.
One possible cause for this issue:
If the error `RuntimeError: expected scalar type float but found c10::Half`
occurs when running SVD with fp16, or an out-of-memory error occurs after switching to fp32, please refer to the answer under this issue and upgrade torch to 2.x.
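As a quick sanity check before running fp16 SVD inference, you can verify the installed torch major version. This is a minimal sketch: the `torch_major_version` helper is hypothetical (not part of the repo), and it assumes the usual `torch.__version__` string format such as `2.1.0+cu118`.

```python
def torch_major_version(version: str) -> int:
    """Parse the major version from a torch version string like '2.1.0+cu118'."""
    return int(version.split(".")[0])

# Per the note above, fp16 SVD needs torch 2.x; check before running, e.g.:
#   import torch
#   assert torch_major_version(torch.__version__) >= 2, "upgrade torch to 2.x for fp16 SVD"
print(torch_major_version("2.1.0+cu118") >= 2)  # True on a torch 2.x install
```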
In my SVD experiment, OOM occurred on an 80 GB A100, so I would like to ask how much GPU memory is needed to run the SVD experiments. Thanks!