Is your feature request related to a problem? Please describe
Currently, Xinference does not support classification models such as BERT-base-chinese. In our project, we load and run these classification models directly in our own application code (roughly as in the sketch below), which ties model loading and hardware management to the application itself. If Xinference could serve these models, it would be a great benefit to us.
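For context, this is a minimal sketch of what "hardcoded" looks like for us today, assuming the Hugging Face transformers pipeline API; the checkpoint name is illustrative (in practice we use a fine-tuned classification checkpoint):

```python
# Minimal sketch of the current hard-coded approach (assumes the Hugging Face
# transformers library; "bert-base-chinese" stands in for our fine-tuned
# classification checkpoint).
from transformers import pipeline

# The model is loaded inside the application process, so every service that
# needs classification bundles its own copy and manages its own CPU/GPU.
classifier = pipeline(
    "text-classification",
    model="bert-base-chinese",  # illustrative; a fine-tuned checkpoint in practice
)

print(classifier("这部电影很好看"))  # e.g. [{'label': ..., 'score': ...}]
```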
Describe the solution you'd like
We would like Xinference to support classification models, enabling us to use models like BERT-base-chinese seamlessly within our project.
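To illustrate the kind of workflow we have in mind, here is a hypothetical sketch modeled on the existing Xinference client. The `model_type="classification"` value and the `classify()` method do not exist today; they are shown only to convey the desired usage, not to propose a final API:

```python
# Hypothetical sketch of the requested feature, modeled on how Xinference
# launches embedding/LLM models today. model_type="classification" and
# model.classify() are NOT existing APIs; they only illustrate the idea.
from xinference.client import Client

client = Client("http://localhost:9997")

# Launch a classification model the same way other model types are launched.
model_uid = client.launch_model(
    model_name="bert-base-chinese",
    model_type="classification",  # hypothetical new model type
)
model = client.get_model(model_uid)

# Hypothetical classify() call returning labels and scores per input text.
print(model.classify(["这部电影很好看", "质量太差了"]))
```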
Describe alternatives you've considered
We have considered continuing to hard-code classification models in our project (as in the sketch above), but this keeps model loading, versioning, and hardware management inside our application code and lacks the flexibility of serving models through Xinference.