vllm-project / vllm

A high-throughput and memory-efficient inference and serving engine for LLMs
https://docs.vllm.ai
Apache License 2.0

Does vllm support CodeGeeX2-6b #683

Closed: aphrodite1028 closed this issue 6 months ago

zhuohan123 commented 1 year ago

CodeGeeX2-6b is based on ChatGLM2. PRs #649 and #635 are working on this; it should be supported once those PRs are merged.

aphrodite1028 commented 1 year ago

> CodeGeeX2-6b is based on ChatGLM2. PRs #649 and #635 are working on this; it should be supported once those PRs are merged.

Thanks for your reply.

hmellor commented 6 months ago

Closing as ChatGLM2 is now supported.
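For readers landing here later, a minimal sketch of running a ChatGLM2-based checkpoint (such as CodeGeeX2-6b) with vLLM's offline `LLM` API might look like the following. The model name, prompt, and sampling settings are illustrative; `trust_remote_code=True` is passed because ChatGLM2-family checkpoints ship custom modeling code on the Hugging Face Hub.

```python
# Hypothetical usage sketch: offline code generation with vLLM.
# Requires a GPU and downloaded model weights; the model name is illustrative.
from vllm import LLM, SamplingParams

# Load a ChatGLM2-based model; trust_remote_code is needed for its custom code.
llm = LLM(model="THUDM/codegeex2-6b", trust_remote_code=True)

# Low temperature tends to suit code completion.
params = SamplingParams(temperature=0.2, max_tokens=128)

outputs = llm.generate(["# write a bubble sort in Python\n"], params)
for out in outputs:
    print(out.outputs[0].text)
```

The same model can also be exposed over HTTP with vLLM's OpenAI-compatible server instead of the offline API, which is the more common deployment path.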