-
### The model to consider.
https://huggingface.co/THUDM/glm-4-9b-chat
### The closest model vllm already supports.
chatglm
### What's your difficulty of supporting the model you want?
_No respons…
-
### System Info
OS = ubuntu 20.04
GPU = 4 x V100
model = glm-9b-chat
python = 3.11.8
llama-factory = 0.8.3
transformers = 4.43.3
vllm = 0.5.3.post1
### Who can help?
_No response_
### …
-
When will GLM-4-Flash be supported? It is a fairly good free model, so it seems worth supporting.
-
When will support for calling the glm-4 model be added?
-
(openvino) [root@localhost openvino]# python3 convert.py --model_id /root/Glm4/GLM-4/glm-4-9b-chat/ --precision int8 --output /root/Glm4/Intel/glm-4-9b-chat-ov
====Exporting IR=====
Traceback (most…
-
GLM_4("GLM-4"),
GLM_4V("glm-4v"),
GLM_4_Air("glm-4-air"),
GLM_4_AirX("glm-4-airx"),
GLM_4_Flash("glm-4-flash"),
GLM_3_Turbo("GLM-3-Turbo");
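The constants above look like an excerpt from a client SDK's enum of GLM model identifiers, pairing each constant with the identifier string sent to the API. A minimal, self-contained sketch of how such an enum is typically completed (the wrapper class name `GlmModelDemo` and the `value()` accessor are assumptions, not part of the original snippet):

```java
// Hypothetical completion of the enum fragment: the class name and
// the value() accessor are assumptions for illustration only.
public class GlmModelDemo {
    public enum GlmModel {
        GLM_4("GLM-4"),
        GLM_4V("glm-4v"),
        GLM_4_Air("glm-4-air"),
        GLM_4_AirX("glm-4-airx"),
        GLM_4_Flash("glm-4-flash"),
        GLM_3_Turbo("GLM-3-Turbo");

        private final String value;

        GlmModel(String value) {
            this.value = value;
        }

        // Identifier string passed to the API, e.g. "glm-4-flash"
        public String value() {
            return value;
        }
    }

    public static void main(String[] args) {
        // List each constant alongside its API identifier
        for (GlmModel m : GlmModel.values()) {
            System.out.println(m.name() + " -> " + m.value());
        }
    }
}
```

Adding a new model variant then reduces to appending one constant with its identifier string, which is presumably what the request for GLM-4 support amounts to.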
-
**Context and question**
Hi, I am using Biomod2 (version 4.2-5-2) to run some species distribution models and I have a problem with BIOMOD_EnsembleForecasting. The previous steps work fine, but appar…
-
### System Info
ubuntu 22.04
### Running Xinference with Docker?
- [ ] docker
- [X] pip install
- [ ] installation from sour…
-
### System Info
Traceback (most recent call last):
File "/usr/local/bin/trtllm-build", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.10/dist-packages/tensorrt_llm/commands/build.p…
-
### Motivation
I would like to run glm-4-9b with lmdeploy, but it does not seem to be supported yet.
### Related resources
https://github.com/THUDM/GLM-4
https://modelscope.cn/models/ZhipuAI/glm-4-9b
https://modelscope.cn/models/ZhipuAI/glm-4-9b-chat…