System Info / 系統信息
4× A10 GPUs (24 GB each), CUDA version: 12
Who can help? / 谁可以帮助到您?
No response
Information / 问题信息
Reproduction / 复现过程
I am trying to load the CogVLM 4-bit model from a local snapshots directory, but I get an error saying there is no visual.py file. The same code works when I pass the Hugging Face Hub path instead.
How can I load the model from a local path? A sketch of what I would expect to work is below.
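For reference, this is a minimal sketch of the local-loading flow I would expect to work, assuming the checkpoint is downloaded in full with snapshot_download so that the custom visual.py comes along with the weights (the repo id and quantization flags below are placeholders for the 4-bit checkpoint I am actually using):

```python
# Minimal sketch, not a verified fix: the repo id is a placeholder for the
# 4-bit CogVLM checkpoint, and the quantization flags are my assumptions.
import torch
from huggingface_hub import snapshot_download
from transformers import AutoModelForCausalLM, BitsAndBytesConfig

# Download the complete repo (weights *and* custom code such as visual.py);
# an incomplete snapshot is one way to end up with the "no visual.py" error.
local_dir = snapshot_download(
    repo_id="THUDM/cogvlm-chat-hf",   # placeholder repo id
    local_dir="./cogvlm-chat-hf",
)

model = AutoModelForCausalLM.from_pretrained(
    local_dir,                        # local path instead of the hub id
    torch_dtype=torch.bfloat16,
    quantization_config=BitsAndBytesConfig(load_in_4bit=True),
    trust_remote_code=True,           # needed so the repo's visual.py is executed
    device_map="auto",                # spread across the 4x A10 GPUs
    low_cpu_mem_usage=True,
).eval()
```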
Expected behavior / 期待表现
The model should load successfully from a local directory or snapshots path, the same as when loading from the Hugging Face Hub.