jovicon opened 4 months ago
Hi, thanks for submitting this issue. At the moment we only support the Ollama endpoint http://localhost:11434/v1, so if you have it running elsewhere, that would cause this error.
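To illustrate why the endpoint location matters: the extension talks to Ollama through its OpenAI-compatible API, and requests go to paths under that base URL. A minimal sketch of the request shape (the model name `llama3` is an assumption for illustration; only the base URL comes from this thread):

```python
import json

# The only endpoint supported today, per the comment above.
base_url = "http://localhost:11434/v1"

# Chat completions live under the OpenAI-compatible path; if Ollama
# runs elsewhere, this full URL changes and requests fail.
endpoint = f"{base_url}/chat/completions"

# Hypothetical request body; "llama3" is an assumed model name.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "hello"}],
}

print(endpoint)
print(json.dumps(payload)["model" in payload and 0:40])
```

Making the base URL configurable (the feature request below) would mean substituting a custom host/port into `base_url` instead of hard-coding localhost:11434.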
Would it be possible to track this as an issue and open a merge request that adds a custom field for configuring the local endpoint?
@jovicon we've been migrating these runbook prompts to allow you to specify the url in the prompt. So you'll be able to specify the location of the OpenAI compatible endpoint when you run the prompt. It's a bit of a different flow than what we used previously but it will be more open to edits. I'll keep this issue open until we have a doc describing how you can try this.
Hi, I tried `make runbook` (v0.0.10) and ran into this problem:
This is my Dockerfile:
and this is my extension config: