Would it be possible to integrate with Ollama, in addition to OpenAI, for those of us who would like a self-hosted solution?
I'm no JS dev, but looking at the code, the use of OpenAI seems fairly localized, and the Ollama API is not overly complex either. You would need to add an extra config option for which model to use, though.
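For illustration, a minimal sketch of what the Ollama side could look like. This assumes Ollama's default local endpoint (`http://localhost:11434`) and its `/api/chat` endpoint, which accepts an OpenAI-style `messages` array; the names `askOllama`, `buildChatRequest`, and `OLLAMA_MODEL` are hypothetical, not taken from this repo:

```typescript
// Hypothetical sketch — not the repo's actual code. The existing OpenAI
// call sites would be swapped for something along these lines.

// The extra config mentioned above: which local model to use.
const OLLAMA_URL = "http://localhost:11434"; // Ollama's default port
const OLLAMA_MODEL = "llama3"; // assumption: would come from user config

// Build the request body for Ollama's non-streaming chat endpoint.
function buildChatRequest(prompt: string) {
  return {
    model: OLLAMA_MODEL,
    messages: [{ role: "user", content: prompt }],
    stream: false, // one JSON response instead of a stream of NDJSON chunks
  };
}

async function askOllama(prompt: string): Promise<string> {
  const res = await fetch(`${OLLAMA_URL}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(prompt)),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  // With stream: false, the reply text is under message.content.
  return data.message.content;
}
```

Since the request/response shapes are so close to OpenAI's chat API, the change might be as small as making the base URL and model name configurable.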