Closed: yu1chaofan closed this issue 1 month ago
Describe the bug
Error loading model
Is there an existing issue for this?
Reproduction
Tested models: qwen1.5-0.5B, phi-1_5, MiniCPM-2B-sft-bf16
Screenshot
No response
Logs
No response

System Info
No response

Comments
I wonder if you could submit this issue to https://github.com/intel-analytics/ipex-llm/issues.

OK, but I have successfully run MiniCPM-2B-sft-bf16 by following the steps at https://github.com/intel-analytics/ipex-llm/tree/main/python/llm/example/GPU/HuggingFace/LLM/minicpm.
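For context, the loading flow in that linked example follows roughly the pattern below. This is a minimal sketch, assuming ipex-llm's transformers-style loader and an Intel GPU ("xpu") device; the model id and prompt here are placeholders, not details taken from this issue:

```python
# Sketch of loading MiniCPM-2B-sft-bf16 with ipex-llm, modeled on the linked
# GPU HuggingFace example. The model path/id below is an assumption.
import torch
from transformers import AutoTokenizer
from ipex_llm.transformers import AutoModelForCausalLM  # ipex-llm's drop-in loader

model_path = "openbmb/MiniCPM-2B-sft-bf16"  # or a local checkpoint directory

# Load with low-bit optimization applied by ipex-llm.
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    load_in_4bit=True,
    optimize_model=True,
    trust_remote_code=True,
    use_cache=True,
)
model = model.half().to("xpu")  # move the optimized model to the Intel GPU

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)

prompt = "What is AI?"  # placeholder prompt
input_ids = tokenizer.encode(prompt, return_tensors="pt").to("xpu")
with torch.inference_mode():
    output = model.generate(input_ids, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If a loading error reproduces even with this minimal flow, the failing model names and the full traceback would be the useful details to include when filing on the ipex-llm tracker.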