Hi, Vanna is great. We have been integrating Vanna and Xinference recently.
Xinference is an open-source platform that can manage many open-source AI models.
All the popular open-source LLMs, including LLaMA and Qwen, can be used through Xinference, so this implementation may be helpful to others.
Project pages: https://github.com/xorbitsai/inference https://github.com/xorbitsai/inference-client
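For context, the model referenced by 'model_uid' has to be launched on the Xinference server first. A minimal sketch of doing that with the xinference-client package is below; the model name, engine, and size are assumptions for a Qwen 14B chat model, and the exact launch parameters depend on your Xinference version and hardware.

from xinference_client import RESTfulClient

# Assumes a local Xinference server is already running on port 9997
client = RESTfulClient("http://localhost:9997")

# launch_model returns the model UID that goes into Vanna's 'model_uid' config;
# the parameters below are illustrative and may differ on your setup.
model_uid = client.launch_model(
    model_name="qwen2.5-instruct",
    model_engine="transformers",
    model_size_in_billions=14,
)
print(model_uid)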
Basic Use:
$ pip install 'vanna[xinference-client]'
from vanna.vannadb import VannaDB_VectorStore
from vanna.xinference import Xinference

# MY_VANNA_MODEL and MY_VANNA_API_KEY are placeholders for your Vanna model name and API key
class VannaXinference(VannaDB_VectorStore, Xinference):
    def __init__(self, config=None):
        VannaDB_VectorStore.__init__(self, vanna_model=MY_VANNA_MODEL, vanna_api_key=MY_VANNA_API_KEY, config=config)
        Xinference.__init__(self, config=config)

vn_xinference = VannaXinference(config={'base_url': 'http://localhost:9997', 'model_uid': 'qwen2.5-instruct-14B'})
I verified it locally, and the following test passed:
vn_xinference = VannaXinference(config={'base_url': 'http://localhost:9997', 'model_uid': 'qwen2.5-instruct-14B'})
vn_xinference.connect_to_sqlite('https://vanna.ai/Chinook.sqlite')

def test_vn_xinference():
    sql = vn_xinference.generate_sql("What are the top 4 customers by sales?")
    df = vn_xinference.run_sql(sql)
    assert len(df) == 4
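Once the test passes, the same object can be used interactively. A small sketch follows; the training strings are illustrative examples rather than part of the original test.

# Optionally add schema context so the LLM sees the relevant tables
# (the DDL and documentation strings below are illustrative)
vn_xinference.train(ddl="CREATE TABLE Customer (CustomerId INTEGER PRIMARY KEY, FirstName TEXT, LastName TEXT)")
vn_xinference.train(documentation="Sales are recorded in the Invoice and InvoiceLine tables.")

# Generate SQL with the Xinference-served LLM, run it, and show the results
vn_xinference.ask("Which country's customers spent the most?")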