stavsap / comfyui-ollama

Apache License 2.0

Prompt outputs failed validation #20

Open karl0ss opened 2 weeks ago

karl0ss commented 2 weeks ago

I am running Ollama remotely, and when I enter my hosted URL it updates the list of models, but it seems you can only run it with a certain few.

image

Prompt outputs failed validation
OllamaGenerateAdvance:
- Value not in list: model: 'brxce/stable-diffusion-prompt-generator:latest' not in ['llama3:8b-instruct-q4_K_M', 'llama3', 'phi3:3.8b-mini-instruct-4k-q4_K_M', 'phi3', 'phi3:3.8b-mini-instruct-4k-fp16']

Is this correct?

stavsap commented 2 weeks ago

No, it should display all the models you have, similar to the `ollama list` command. Then select one of them and it should work with the selected model.

Can you share the console log?

karl0ss commented 2 weeks ago

I will when I'm back at my PC. As you can see from the screenshot, it does display the list of my models from the remote instance, but when I press to queue the generation, it throws the error.

karl0ss commented 2 weeks ago

To see the GUI go to: http://0.0.0.0:8188
HTTP Request: GET http://192.168.4.10:11434/api/tags "HTTP/1.1 200 OK"
got prompt
Failed to validate prompt for output 9:
* OllamaGenerateAdvance 10:
  - Value not in list: model: 'brxce/stable-diffusion-prompt-generator:latest' not in ['llama3:8b-instruct-q4_K_M', 'llama3', 'phi3:3.8b-mini-instruct-4k-q4_K_M', 'phi3', 'phi3:3.8b-mini-instruct-4k-fp16']
Output will be ignored
Failed to validate prompt for output 11:
Output will be ignored
invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}

Attached is the response from the /api/tags request.

response.json

I downloaded llama3:8b-instruct-q4_K_M and if I select that, it does work, but it's strange that it won't let me use a model from the returned list.

stavsap commented 1 week ago

image

Tried to reproduce it without success; it seems to work fine with this model.

Is the list you see in the dropdown the same as what the `ollama list` command shows?

Do you also have Ollama on localhost?

karl0ss commented 1 week ago

I don't have Ollama locally; it's running in an LXC on the same host though, and it's used remotely like this for OpenWebUI as well as Continue, so that works OK.

The returned list, as in my screenshot, is the list of models I have on my server, and it ties up with the response.json I sent. But as soon as I press Queue Prompt, it gives that error, as if it didn't update the "list" variable with the response from my Ollama instance and is still using the "preset/demo" list...

It doesn't matter which model I select; it will only work if I download one of the exact models it has as the "default".
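For context on the hypothesis above: the "Value not in list" error comes from ComfyUI's server-side prompt validation, which checks the submitted model name against the choices declared by the node's INPUT_TYPES, independently of what the browser dropdown currently shows. Below is a minimal sketch of how such a mismatch can arise, using a hypothetical node class and an assumed default URL, not the actual comfyui-ollama code:

```python
# Hypothetical sketch: a ComfyUI-style node that builds its model choices
# server-side when the node class is evaluated. If this request targets a
# default localhost URL (or fails and falls back to a hard-coded list),
# the server-side choices can differ from the models the browser fetched
# from the remote Ollama instance, and validation then rejects the selection.
import json
import urllib.request

DEFAULT_OLLAMA_URL = "http://127.0.0.1:11434"  # assumed default, for illustration only


def fetch_model_names(base_url: str) -> list[str]:
    """Return the model names reported by Ollama's /api/tags endpoint."""
    with urllib.request.urlopen(f"{base_url}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]


class OllamaGenerateSketch:
    @classmethod
    def INPUT_TYPES(cls):
        try:
            models = fetch_model_names(DEFAULT_OLLAMA_URL)
        except OSError:
            models = ["llama3"]  # illustrative fallback, not the node's real default
        # ComfyUI validates the queued prompt's "model" value against this list,
        # so a model chosen from a dropdown populated elsewhere can fail validation.
        return {"required": {"model": (models,)}}
```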

stavsap commented 1 week ago

There is no preset list; it is dynamically populated from the API call to /api/tags.

Are you on the latest version?

Try refreshing the browser. To force a list update you can select the URL input; pressing OK will re-fetch the list.
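As a cross-check of what the server sees, here is a small diagnostic snippet (the host and port are taken from the console log earlier in this thread) that prints the model names returned by /api/tags; the output should match both the dropdown and `ollama list`:

```python
# Diagnostic: print the model names the remote Ollama instance reports via /api/tags.
# The address below is the one shown in the console log above; adjust as needed.
import json
import urllib.request

OLLAMA_URL = "http://192.168.4.10:11434"

with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
    tags = json.load(resp)

for model in tags.get("models", []):
    print(model["name"])
```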

stavsap commented 1 week ago

This is the model update hook:

https://github.com/stavsap/comfyui-ollama/blob/main/web/js/OllamaNode.js

Each time the URL field is updated, there is a fetch of the model names.