Closed i-ate-a-vm closed 2 months ago
Hello, do you know what the correct response from the Ollama web UI is? I have been having issues running this API since it was updated. Many thanks. Edit: This should be fixed in the latest release.
Executive decision: individual configuration issues will receive an answer and will be considered closed after 24 hours without a reply. Many thanks.
Hi @rjmacarthy, should this be fixed in the latest Twinny release or Ollama release? I pulled the latest Twinny version and still see the issue. I'd appreciate a pointer to the bug in Ollama you saw resolved if that was what you were referring to. Thanks so much!
EDIT, scratch that. I'm still seeing an error but it is a different one, so I'll open a separate issue.
Hey, the FIM endpoint should be /ollama/api/generate when using Open WebUI. It was fixed in https://github.com/rjmacarthy/twinny/commit/60ee0fd44f1820e54518848f497f2923f9523ece and I tested it working. If it's still an issue, please check your configuration.
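For anyone debugging their configuration, here is a minimal sketch of what a FIM request to that endpoint looks like when Open WebUI proxies Ollama. The base URL, API key, and model name are placeholders for illustration, not values from this thread; Ollama's /api/generate accepts a `suffix` field for FIM-capable models.

```python
import json
import urllib.request

OPEN_WEBUI_BASE = "http://localhost:3000"  # assumed Open WebUI address; replace with yours
FIM_PATH = "/ollama/api/generate"          # Ollama proxy route mentioned in the fix above

def build_fim_request(prefix: str, suffix: str, model: str = "codellama:7b-code"):
    """Assemble a fill-in-the-middle request for the Ollama generate API."""
    payload = {
        "model": model,      # placeholder model name; use one you have pulled
        "prompt": prefix,    # text before the cursor
        "suffix": suffix,    # text after the cursor (FIM)
        "stream": False,
    }
    return urllib.request.Request(
        OPEN_WEBUI_BASE + FIM_PATH,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Open WebUI authenticates API calls with a bearer token.
            "Authorization": "Bearer YOUR_API_KEY",  # placeholder key
        },
        method="POST",
    )

req = build_fim_request("def add(a, b):\n    return ", "\n")
print(req.full_url)  # http://localhost:3000/ollama/api/generate
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) should return a JSON body whose `response` field holds the completion, assuming the server and model are reachable.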
Describe the bug While attempting to use a remote Open WebUI server for FIM, I encounter a TypeError.
To Reproduce Steps to reproduce the behavior:
Expected behavior I expect the code autocompletion to be generated on the Open WebUI server successfully, and for the resulting code to be correctly inserted into the code file.
Logging I was able to find this error in the Extension logs:
Desktop (please complete the following information):
Additional context The Open WebUI server works successfully for chat, and I can see in its logs that requests are reaching the /api/generate route successfully, suggesting that this isn't a connectivity issue.