THUDM / ChatGLM-6B

ChatGLM-6B: An Open Bilingual Dialogue Language Model | 开源双语对话语言模型
Apache License 2.0
40.75k stars 5.22k forks

langchain-ChatGLM fine-tuning error #1349

Open darkneu opened 1 year ago

darkneu commented 1 year ago

Is there an existing issue for this?

Current Behavior

The error output is:

load_model_config modle\chatglm-6b-int4...
Loading modle\chatglm-6b-int4...
No compiled kernel found.
Compiling kernels : C:\Users\24507.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\quantization_kernels_parallel.c
Compiling gcc -O3 -fPIC -pthread -fopenmp -std=c99 C:\Users\24507.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\quantization_kernels_parallel.c -shared -o C:\Users\24507.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\quantization_kernels_parallel.so
Load kernel : C:\Users\24507.cache\huggingface\modules\transformers_modules\chatglm-6b-int4\quantization_kernels_parallel.so
Setting CPU quantization kernel threads to 8
Using quantization cache
Applying quantization to glm layers
Loaded the model in 5.40 seconds.
Backend TkAgg is interactive backend. Turning interactive mode on.

module 'models' has no attribute 'ChatGLM'

File "E:\astrochat\langchain-ChatGLM-master\models\shared.py", line 41, in loaderLLM
    provides_class = getattr(sys.modules['models'], llm_model_info['provides'])
File "E:\astrochat\langchain-ChatGLM-master\webui.py", line 106, in init_model
    llm_model_ins = shared.loaderLLM()
File "E:\astrochat\langchain-ChatGLM-master\webui.py", line 333, in
    model_status = init_model()
AttributeError: module 'models' has no attribute 'ChatGLM'

How should I fix this?
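The failing line in models/shared.py resolves the class by name with getattr on the models package, so an ImportError raised inside models/__init__.py (for example, a dependency that fails to load on Windows) can surface later as this bare AttributeError. Below is a minimal diagnostic sketch that imports the module directly to expose the underlying error; the function name diagnose_provides and the example module paths are my own illustration, not part of langchain-ChatGLM:

```python
import importlib


def diagnose_provides(module_name: str, class_name: str):
    """Resolve class_name from module_name, re-raising with a clearer message.

    Mirrors the getattr(sys.modules['models'], ...) lookup in shared.py:
    importing the module directly surfaces the real ImportError that made
    the attribute disappear from the package namespace.
    """
    try:
        mod = importlib.import_module(module_name)
        return getattr(mod, class_name)
    except (ImportError, AttributeError) as exc:
        raise RuntimeError(
            f"could not load {class_name!r} from {module_name!r}: {exc}"
        ) from exc
```

Running something like `diagnose_provides("models.chatglm_llm", "ChatGLM")` from the project root (assuming that is where the class lives in your checkout) should show the real import failure instead of the bare AttributeError.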

Expected Behavior

No response

Steps To Reproduce

*

Environment

- Python:3.10.12
- CUDA Support: True

Anything else?

No response

bdd-nudt commented 11 months ago

I have the same problem. How do I fix it?

darkneu commented 8 months ago

> I have the same problem. How do I fix it?

I installed Linux and running it there works without any problem.