hgupta12 opened 2 weeks ago
@hgupta12 Can we work on this issue to make it more dynamic? For example: an automated check for whether the user's machine has Ollama running (or Ollama hosted somewhere on their network), and then listing the models available on that machine (llama3.1, mistral, tinyllama, etc.). This gives users more flexibility to choose their preferred model, or whichever models are most accessible to them. I was thinking we let them configure it and we just provide a UI for the user to do so.
I went through your pull request and it's almost there. I think if you modify it to be more dynamic and adapt to the user's setup, it would be a great fallback layer instead of relying only on OpenAI.
Not to mention the security benefits of data never leaving your machine or your local network.
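
For reference, here is a minimal sketch of that availability check. It assumes a TypeScript codebase with a global `fetch` (Node 18+); `OLLAMA_URL` and `listOllamaModels` are hypothetical names rather than existing code, while `GET /api/tags` is Ollama's documented endpoint for listing locally installed models:

```ts
// Default to Ollama's standard local port; a network host could be
// supplied via configuration instead (OLLAMA_URL is an assumed env var).
const OLLAMA_URL = process.env.OLLAMA_URL ?? "http://localhost:11434";

interface OllamaTagsResponse {
  models: { name: string }[]; // e.g. "llama3.1:latest", "mistral:latest"
}

// Probe the Ollama instance; return its model names, or null if unreachable.
async function listOllamaModels(): Promise<string[] | null> {
  try {
    // GET /api/tags lists the models installed on the Ollama host.
    const res = await fetch(`${OLLAMA_URL}/api/tags`, {
      signal: AbortSignal.timeout(2000), // fail fast when nothing is listening
    });
    if (!res.ok) return null;
    const data = (await res.json()) as OllamaTagsResponse;
    return data.models.map((m) => m.name);
  } catch {
    return null; // connection refused, timeout, etc. => treat as "no Ollama"
  }
}
```

Returning `null` instead of throwing lets the caller treat "no Ollama" the same way as "no OpenAI key" when deciding which provider to offer in the UI.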
Hey @ryzxxn, thanks for the input! I'll update the PR accordingly.
Currently, the Export AI feature works only if the user has an OpenAI key. It would be nice to have support for Ollama as a fallback when an OpenAI key is not present.
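
A rough sketch of how that fallback could be wired up, reusing the hypothetical `listOllamaModels` helper from the earlier comment; `pickExportProvider` and the `ExportProvider` type are illustrative names, not part of the existing code:

```ts
// Prefer OpenAI when a key is configured; otherwise fall back to a
// reachable Ollama instance.
type ExportProvider =
  | { provider: "openai" }
  | { provider: "ollama"; model: string };

async function pickExportProvider(openAiKey?: string): Promise<ExportProvider> {
  if (openAiKey) return { provider: "openai" };

  const models = await listOllamaModels(); // helper from the sketch above
  if (models && models.length > 0) {
    // In the real UI the user would pick; default to the first model here.
    return { provider: "ollama", model: models[0] };
  }
  throw new Error(
    "Export AI unavailable: set an OpenAI key or start Ollama locally."
  );
}
```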