ZJUICI / vllm
A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0
feat: support for TableGPT multimodal model #4
Closed
zTaoplus closed this 2 months ago