li-plus / chatglm.cpp
C++ implementation of ChatGLM-6B & ChatGLM2-6B & ChatGLM3 & GLM4
MIT License · 2.81k stars · 326 forks
Issues
Inference sometimes fails with inputs of varying text length: check failed (std::isfinite(next_token_logits[i])) nan/inf encountered at lm_logits[0]
#333 · leizhu1989 · opened 2 days ago · 1 comment
About the quality of content summarization
#332 · leizhu1989 · opened 2 days ago · 0 comments
Quantized model cannot use GPU resources on a T4 server; what could be the reason?
#331 · hithepeng · opened 6 days ago · 0 comments
Concurrency issue
#330 · Ab-123 · opened 6 days ago · 0 comments
Error running glm4-ggml on a Mac M1 Pro with 16 GB
#329 · lwo2002 · opened 1 week ago · 1 comment
self.tokenizer is None; how to solve this problem?
#328 · lzj-r · opened 1 week ago · 3 comments
Cannot import chatglm_cpp; added the following code to web-demo.py
#327 · lzj-r · opened 1 week ago · 0 comments
Is quantization supported for the glm-4-9b-chat-1m model?
#326 · leizhu1989 · opened 1 week ago · 3 comments
Exception when building the Docker image on ARM
#325 · hooploop · opened 1 week ago · 0 comments
Quantized GGUF model file downloaded from Hugging Face fails at runtime
#324 · vaxilicaihouxian · closed 1 week ago · 1 comment
Question: installing v0.4.0 via pip hangs; is there an option to limit compilation resources?
#323 · ChengjieLi28 · opened 1 week ago · 0 comments
Fix nan by rescheduling attention scaling
#322 · li-plus · closed 1 week ago · 0 comments
Does enabling Metal on macOS now require installing Xcode?
#321 · mahabuta · opened 1 week ago · 0 comments
Same model: openai_api.py uses significantly more GPU memory than the CLI
#320 · 3wweiweiwu · closed 1 week ago · 1 comment
OpenAI API server returns empty answers
#319 · FuturePrayer · closed 2 weeks ago · 2 comments
Has anyone seen GLM4 start producing gibberish after quantization?
#318 · piaodangdang · opened 2 weeks ago · 6 comments
Disable shared library by default. Set default max_length in api server.
#317 · li-plus · closed 2 weeks ago · 0 comments
libre2.so.11 not found
#316 · hooploop · closed 2 weeks ago · 1 comment
chatglm_cpp-0.3.4 finally works
#315 · sqhua · opened 2 weeks ago · 0 comments
Fix regex lookahead for code input tokenization
#314 · li-plus · closed 2 weeks ago · 0 comments
Error serving the quantized GLM4 model with openai_api.py
#313 · gabrielpondc · closed 2 weeks ago · 4 comments
Is GLM-4V supported yet?
#312 · yhl41001 · opened 2 weeks ago · 6 comments
OpenAI API server errors on non-streaming requests
#311 · FuturePrayer · closed 2 weeks ago · 6 comments
Error running GLM4
#310 · zArche · closed 2 weeks ago · 1 comment
Use apply_chat_template to calculate tokens
#309 · dixyes · closed 3 weeks ago · 1 comment
Can't use web_demo with GLM4
#308 · jtc1246 · closed 3 weeks ago · 1 comment
chatglm4: Running ./build/bin/main gives "Error: invalid model type 4"
#307 · guissy · closed 3 weeks ago · 2 comments
Update py interface
#306 · li-plus · closed 3 weeks ago · 0 comments
Dynamic memory allocation. Drop Baichuan/InternLM support in favor of llama.cpp.
#305 · li-plus · closed 1 week ago · 0 comments
Add ChatGLM model type & tokenizer to pybind
#304 · li-plus · closed 3 weeks ago · 0 comments
Support ChatGLM4 conversation mode
#303 · li-plus · closed 3 weeks ago · 1 comment
Will it support glm-4-9b-chat?
#302 · okwinds · closed 3 weeks ago · 1 comment
GLM-4-9B has been released; are there plans to support it?
#301 · yuezhishun · closed 3 weeks ago · 12 comments
The main binary is missing after the build
#300 · wwwsctvcom · opened 1 month ago · 0 comments
Error converting chatglm3-6b to q4_0 on Win11
#299 · S0uLHun43r · closed 1 month ago · 0 comments
pip install -U chatglm-cpp fails: OSError: [WinError 1] Incorrect function.: 'R:\\Temp\\pip-build-env-mhz20gwx\\overlay\\Lib\\site-packages\\cmake\\data\\bin\\cmake-gui.exe'
#298 · Janet-Baker · opened 1 month ago · 0 comments
[Solved] "DLL load failed while importing _C" with the chatglm_cpp module on Windows
#297 · LTHPKBTE · opened 1 month ago · 2 comments
Separate folder for ggml models & Fix dockerfile
#296 · li-plus · closed 2 months ago · 0 comments
chatglm.cpp:152 check failed (messages.size() % 2 == 1) invalid chat messages size 2
#295 · zhuwensi · closed 2 months ago · 2 comments
Fix cibuildwheel name conflict
#294 · li-plus · closed 2 months ago · 0 comments
Inference slows down as the context grows; how can this be fixed?
#293 · youranjvshi123 · opened 2 months ago · 2 comments
In my understanding, check_chat_messages() should only validate user/assistant messages
#292 · sswater · closed 2 months ago · 1 comment
Fix cibuildwheel on GitHub Actions
#291 · li-plus · closed 2 months ago · 0 comments
Can quantization be supported for the ChatGLM3-6B-128K model?
#290 · dfengpo · opened 2 months ago · 0 comments
Support p-tuning v2 finetuned models for ChatGLM family
#289 · li-plus · closed 2 months ago · 0 comments
How to compile a program compatible with a certain class of GPUs?
#288 · yangliangguang · opened 2 months ago · 0 comments
Error building wheels for chatglm.cpp on Windows
#287 · srdevore · opened 2 months ago · 1 comment
What do the three class member variables in _C.pyi mean and how should they be filled in? Also, how should tool_calls in ChatMessage be filled in?
#286 · Eddy-Powers · opened 2 months ago · 0 comments
About the files produced by convert.py
#285 · catmeowjiao · opened 3 months ago · 0 comments
About the maximum number of tokens in a single reply
#284 · MW-S · opened 3 months ago · 0 comments