ZJUICI/vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0
0 stars, 0 forks
Issues
#5  feat: support for TableGPT multimodal model (zTaoplus, opened 1 week ago, 0 comments)
#4  feat: support for TableGPT multimodal model (zTaoplus, closed 1 week ago, 0 comments)
#3  [Bug]: Background loop is stopped (edwardzjl, opened 3 weeks ago, 3 comments)
#2  WIP: support table multi data base on vllm-0.5.4 (zTaoplus, closed 1 week ago, 0 comments)
#1  feat: api server supports prompt token ids input (zTaoplus, closed 9 months ago, 0 comments)