-
Hello, could you please share the model files for chatglm-6b? Contact: 1970198034@qq.com
-
### System Info
```shell
(habanalabs-venv) (habanalabs-venv) root@vmInstancetmhmacpr:~/lier_workload/test# python inference_test.py
/usr/lib/python3.10/inspect.py:288: FutureWarning: `torch.distrib…
```
-
![image](https://user-images.githubusercontent.com/10215059/231936417-5b8e4c00-57e2-408b-baa1-18183730212c.png)
Running `/ChatGLM-6B/textgen/examples/chatglm$ python predict_demo.py` throws an error. The glm6B model being used is the orig…
-
Hi,
I ran into an error when using the `model.chat()` function. Here are some screenshots of it.
The error seems to occur in `~/.cache/huggingface/modules/transformers_modules/visualglm-6…`
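For reference, a minimal sketch of how `model.chat()` is usually invoked for ChatGLM-6B (the VisualGLM variant additionally takes an image path; the repo id and prompt below are illustrative):

```python
from transformers import AutoTokenizer, AutoModel

# trust_remote_code=True pulls in the custom ChatGLM modeling code, which ends up
# cached under ~/.cache/huggingface/modules/transformers_modules.
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm-6b", trust_remote_code=True).half().cuda()
model = model.eval()

# chat() returns the reply plus the updated dialogue history.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)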
-
1. https://github.com/THUDM/ChatGLM-6B/blob/main/README_en.md
> ChatGLM-6B uses technology similar to ChatGPT, optimized for Chinese QA and dialogue. The model is trained for about 1T tokens of Chine…
-
Hi, is the __GLM model__ in chatglm-6b pretrained with this repo plus a few modifications (such as adding RoPE embeddings and using a new tokenizer), or with the GLM-130B-based repo, which can use Megatron model…
-
I downloaded chatglm2 into the directory shown below
![image](https://github.com/user-attachments/assets/cabf6b33-475a-4783-b74e-7bb26fdc1088)
but the error says the specified model cannot be found.
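A minimal sketch of loading the model from a local directory; the path below is an illustrative assumption and must point at the folder that actually contains `config.json`, the tokenizer files, and the weight shards:

```python
from transformers import AutoTokenizer, AutoModel

# If from_pretrained is not given a valid local folder, transformers treats the
# string as a Hub repo id and reports that the model cannot be found.
local_path = "/path/to/chatglm2-6b"  # illustrative; use the actual download directory
tokenizer = AutoTokenizer.from_pretrained(local_path, trust_remote_code=True)
model = AutoModel.from_pretrained(local_path, trust_remote_code=True).half().cuda()
model = model.eval()
```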
-
## Issue Description
Please briefly describe the issue you encountered.
The ChatGLM2-6B model was downloaded into the local /root/ChatGLM directory,
and the server was deployed with vLLM:
vllm serve /root/ChatGLM --chat-template ./…
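As a sanity check, a minimal sketch of querying the vLLM OpenAI-compatible endpoint once the server is up; the host, port, and served model name below are assumptions and must match what the running `vllm serve` instance reports:

```python
from openai import OpenAI

# vLLM exposes an OpenAI-compatible API (default port 8000); the api_key is unused
# unless the server was started with one.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

completion = client.chat.completions.create(
    model="/root/ChatGLM",  # by default the served model name is the path passed to `vllm serve`
    messages=[{"role": "user", "content": "你好"}],
)
print(completion.choices[0].message.content)
```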
-
### Is your feature request related to a problem? Please describe.
Following the README instructions to run CPU inference with the quantized ChatGLM2-6B-int4 model fails with the following error:
![image-20230721201053533](https://github.com/THUDM/ChatGLM2-6B/assets/57085345…
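For context, a minimal sketch of the CPU path described in the README, which loads the int4 weights in float32 rather than `.half().cuda()` (assuming a working gcc/OpenMP toolchain is available to compile the quantization kernels):

```python
from transformers import AutoTokenizer, AutoModel

# CPU inference with the int4 quantized weights: call .float() instead of .half().cuda().
tokenizer = AutoTokenizer.from_pretrained("THUDM/chatglm2-6b-int4", trust_remote_code=True)
model = AutoModel.from_pretrained("THUDM/chatglm2-6b-int4", trust_remote_code=True).float()
model = model.eval()

response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```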