Describe the bug
When I use Xinference to run codellama-70b-instruct, it outputs a stream of unrelated text, as shown below:
![image](https://github.com/xorbitsai/inference/assets/16621867/256e1a1d-6cea-4760-ad4b-aaa4a3ddcafd)
There is also no mention of codellama-70b-instruct in the docs (https://inference.readthedocs.io/zh-cn/latest/models/builtin/llm/code-llama-instruct.html). Does Xinference not support codellama-70b-instruct?
If it is not supported, is there any plan to support codellama-70b in the future?
Could you give me a hand with this? Thanks a lot.