Open mattt opened 3 days ago
I've gotten a report that Cog isn't working with some of the language models in this repo, in particular llama-2-70b-chat-awq and mixtral-instruct.
Could you please consider removing these examples and/or linking instead to the new, official Cog project for vLLM?