Open ehsan2003 opened 5 months ago
Is it possible to use ChatGPT (with an API key) instead of a local model?
With the new OpenAI-compatible Ollama endpoints, this is definitely possible. I'll have to look into any potential downsides to changing the endpoints we call.
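As a rough sketch of what swapping backends could look like: since Ollama exposes an OpenAI-compatible API under `/v1` (by default at `http://localhost:11434/v1`), switching between the local model and the hosted OpenAI API is mostly a matter of changing the base URL, API key, and model name. The function and model names below are illustrative assumptions, not part of this project's code.

```python
import os

def chat_endpoint(use_openai: bool) -> dict:
    """Return connection settings for the chosen chat-completions backend.

    Hypothetical helper: both backends speak the same OpenAI-style
    chat-completions API, so only the base URL, auth header, and model
    name differ.
    """
    if use_openai:
        return {
            "base_url": "https://api.openai.com/v1",
            # Requires a real key in OPENAI_API_KEY.
            "headers": {"Authorization": f"Bearer {os.environ.get('OPENAI_API_KEY', '')}"},
            "model": "gpt-4o-mini",  # example model name
        }
    # Ollama's OpenAI-compatible endpoint; no API key needed locally.
    return {
        "base_url": "http://localhost:11434/v1",
        "headers": {},
        "model": "llama3",  # example local model
    }
```

Either settings dict could then be fed to any OpenAI-compatible client; the open question mentioned above is whether the hosted endpoints behave differently enough (rate limits, cost, latency) to need special handling.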