Canner / WrenAI

🚀 An open-source SQL AI (Text-to-SQL) Agent that empowers data and product teams to chat with their data. 🤘
https://getwren.ai/oss
GNU Affero General Public License v3.0

Change OpenAI URL #330

Closed: mobguang closed this issue 4 months ago

mobguang commented 5 months ago

Is your feature request related to a problem? Please describe.
How can I change the OpenAI URL when WrenAI starts up?

Can I change the corresponding URL when I set the OpenAI key? [screenshot omitted]


cyyeh commented 5 months ago

@mobguang Hi, it seems that you would like to use an LLM other than OpenAI's? Which LLM would you like to use instead?

mobguang commented 5 months ago

Hi @cyyeh ,

Thanks for your reply. Actually, I don't have an OpenAI account of my own, only a proxy account that forwards requests to OpenAI, so I want to change the OpenAI base URL to the proxy URL.

On the other hand, I would also like to use other LLMs, such as Meta-Llama-3-8B-Instruct and Mistral-7B-Instruct-v0.3.

Could you please kindly provide some instructions?

P.S. May I know whether WrenAI supports converting Chinese query text to SQL?

Thanks in advance.

cyyeh commented 5 months ago

At the moment we do not officially support changing the OpenAI base URL. I've done some testing on this topic, and you can check out this link for further details: https://github.com/Canner/WrenAI/issues/277

Regarding "may I know whether WrenAI supports converting Chinese query text to SQL?": officially no, since we haven't thoroughly tested this functionality, and it really depends on the LLM you're using.
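
For reference, most OpenAI-compatible proxies work by overriding the client's base URL. Here is a minimal sketch using the OpenAI Python SDK; the proxy URL, key, and model name are placeholders, and this is not WrenAI's internal configuration:

```python
# Minimal sketch: pointing the OpenAI Python SDK at an OpenAI-compatible proxy.
# The base_url and model below are placeholders, not WrenAI settings.
from openai import OpenAI

client = OpenAI(
    api_key="sk-proxy-key",                              # key issued by the proxy account
    base_url="https://my-openai-proxy.example.com/v1",   # hypothetical proxy endpoint
)

resp = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Translate to SQL: total sales per month"}],
)
print(resp.choices[0].message.content)
```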

mobguang commented 5 months ago

Thanks

cyyeh commented 5 months ago

We're now working hard on implementing our evaluation framework, and later we'll make it easier to contribute AI pipelines by making them more transparent. After that, you will be able to easily add your preferred LLM or other components and test them. For now, if you need to use other LLMs, you can refer to this document: https://docs.getwren.ai/installation/custom_llm

mobguang commented 5 months ago

> We're now working hard on implementing our evaluation framework, and later we'll make it easier to contribute AI pipelines by making them more transparent. After that, you will be able to easily add your preferred LLM or other components and test them. For now, if you need to use other LLMs, you can refer to this document: https://docs.getwren.ai/installation/custom_llm

Great! Waiting for the good news. 😄

cyyeh commented 5 months ago

All, Ollama has been integrated in this branch, and you can also use OpenAI API-compatible LLMs: chore/ai-service/update-env. We'll merge this branch into the main branch in the near future and update the documentation. As of now, I'll delete the original Ollama branch. Thank you all for your patience.

related pr: https://github.com/Canner/WrenAI/pull/376

cyyeh commented 4 months ago

All, we now support Ollama, Azure OpenAI, and OpenAI API-compatible LLMs with the latest release: https://github.com/Canner/WrenAI/releases/tag/0.6.0

Setup instructions for running Wren AI with a custom LLM: https://docs.getwren.ai/installation/custom_llm#running-wren-ai-with-your-custom-llm-or-document-store

Currently, there is one notable limitation for custom LLMs: you need to use the same provider (such as OpenAI or Ollama) for both the LLM and the embedding model. We'll fix that and release a new version soon. Stay tuned 🙂
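
To illustrate the constraint, here is a sketch under the assumption that both models are served from the same local Ollama instance through its OpenAI-compatible API (and that your Ollama version exposes the /v1/embeddings endpoint); the model names are examples, not WrenAI defaults:

```python
# Sketch of the "same provider" limitation: the LLM and the embedding model both
# point at the same provider (here, a local Ollama instance exposing an
# OpenAI-compatible API). Mixing, say, an Ollama LLM with OpenAI embeddings is
# what the current release does not support.
from openai import OpenAI

ollama = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

# LLM call for SQL generation (example model)
chat = ollama.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Generate SQL: orders per customer"}],
)

# Embedding call goes to the same provider (example embedding model)
emb = ollama.embeddings.create(model="nomic-embed-text", input="orders table schema")

print(chat.choices[0].message.content)
print(len(emb.data[0].embedding))
```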

I'll close this issue as completed now.

mobguang commented 4 months ago

@cyyeh Thanks for the good news!

Namec999 commented 4 months ago

I'm currently trying to use Groq as an OpenAI-compatible API provider.

I am following the docs, but I think I am missing something.

Can you show me how to achieve this?
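
For what it's worth, Groq exposes an OpenAI-compatible endpoint, so the same base-URL pattern shown earlier applies outside WrenAI as well; a minimal sketch (the model name is an example, and the exact WrenAI environment variables are described in the custom_llm doc linked above):

```python
# Sketch: talking to Groq through its OpenAI-compatible API. This only shows the
# general pattern; it is not WrenAI's configuration.
from openai import OpenAI

groq = OpenAI(
    api_key="gsk-...",                         # Groq API key
    base_url="https://api.groq.com/openai/v1",
)

resp = groq.chat.completions.create(
    model="llama3-8b-8192",                    # example Groq-hosted model
    messages=[{"role": "user", "content": "Write SQL to count rows in users"}],
)
print(resp.choices[0].message.content)
```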