Ask a question

I noticed this awesome project today. However, it currently seems to require sending documents to OpenAI via its API, which is a significant concern for me. Is there a plan to provide an offline version (e.g., replacing the GPT model with a locally deployed model) in the future? Thank you!

It appears this is already possible. The Python example shows how to point the library at other models, and it links to LiteLLM for details on connecting models beyond the ones in the examples. LiteLLM supports Ollama (and probably other local backends), so you can already run it through whatever you want.

Thank you! I will take a look at this.

Closed: cainmagi closed this 1 month ago
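As a rough sketch of the LiteLLM route mentioned in the reply: LiteLLM uses provider-prefixed model ids (e.g. `ollama/...`) and accepts an `api_base` pointing at a local server. The model name `ollama/llama3`, the default Ollama port 11434, and the helper function below are illustrative assumptions, not part of this project's documented API.

```python
# Hypothetical sketch: building LiteLLM call arguments that target a local
# Ollama server instead of the OpenAI API. Defaults here are assumptions.
def build_local_llm_config(model_name="ollama/llama3",
                           host="http://localhost:11434"):
    """Return kwargs for litellm.completion() aimed at a local Ollama model."""
    return {
        "model": model_name,   # LiteLLM provider-prefixed model id
        "api_base": host,      # local endpoint; nothing is sent to OpenAI
        "messages": [{"role": "user", "content": "Summarize this document."}],
    }
```

With `litellm` installed and an Ollama server running locally, something like `litellm.completion(**build_local_llm_config())` would keep the document on your own machine.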