Describe the need of your request
The Ollama Provider already allows setting the base host, so I can use an Ollama instance on a remote server. However, if the remote server requires some kind of authentication (e.g. behind a proxy), I cannot configure it the way I can with the LLaMA C/C++ (Local) or Custom OpenAI providers.
With authentication support it would be possible to use a remote Ollama server and easily switch between its models.
Proposed solution
Also support authentication (or arbitrary custom headers) for the Ollama Provider, as sketched below.
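A minimal sketch of what this could look like, assuming a `customHeaders` setting merged into every request; the `OllamaProviderConfig` shape and `ollamaRequest` helper are hypothetical illustrations, not existing APIs of this project:

```typescript
// Hypothetical sketch: attach user-configured headers to every Ollama API call.
interface OllamaProviderConfig {
  baseHost: string; // already configurable today, e.g. "https://ollama.example.com"
  customHeaders?: Record<string, string>; // proposed: e.g. { Authorization: "Bearer <token>" }
}

async function ollamaRequest(
  config: OllamaProviderConfig,
  path: string,
  body: unknown
): Promise<Response> {
  return fetch(`${config.baseHost}${path}`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      ...config.customHeaders, // proxy auth headers merged into every request
    },
    body: JSON.stringify(body),
  });
}

// Example: calling a remote instance behind a proxy that requires basic auth.
const config: OllamaProviderConfig = {
  baseHost: "https://ollama.example.com",
  customHeaders: { Authorization: "Basic dXNlcjpwYXNz" },
};
ollamaRequest(config, "/api/generate", { model: "llama3", prompt: "Hello" });
```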
Additional context
No response