stavsap / comfyui-ollama

Apache License 2.0

Prompt outputs failed validation #20

Closed: karl0ss closed this 2 months ago

karl0ss commented 4 months ago

I am running Ollama remotely, and when I enter my hosted URL it updates the list of models, but it seems you can only run it with a certain few models (see screenshot).

Prompt outputs failed validation
OllamaGenerateAdvance:
- Value not in list: model: 'brxce/stable-diffusion-prompt-generator:latest' not in ['llama3:8b-instruct-q4_K_M', 'llama3', 'phi3:3.8b-mini-instruct-4k-q4_K_M', 'phi3', 'phi3:3.8b-mini-instruct-4k-fp16']

Is this correct?

stavsap commented 4 months ago

No, it should display all the models you have, similar to the ollama list command. Select one of them and it should work with the selected model.

Can you share the console log?
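
For reference, the dropdown should mirror what Ollama's /api/tags endpoint returns, i.e. the same names ollama list shows. As a quick standalone check (illustrative only, not the node's own code), something along these lines can be run against the remote host:

```python
# Standalone sanity check: list the model names an Ollama server reports
# via /api/tags, which is the same set of names `ollama list` shows.
import requests

OLLAMA_URL = "http://192.168.4.10:11434"  # the remote host from the log below

resp = requests.get(f"{OLLAMA_URL}/api/tags", timeout=10)
resp.raise_for_status()
models = [m["name"] for m in resp.json().get("models", [])]
print(models)
```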

karl0ss commented 4 months ago

I will when I'm back at my PC. As you can see from the screenshot, it does display the list of my models from the remote instance, but when I press Queue Prompt to generate, it throws the error...

karl0ss commented 4 months ago

To see the GUI go to: http://0.0.0.0:8188
HTTP Request: GET http://192.168.4.10:11434/api/tags "HTTP/1.1 200 OK"
got prompt
Failed to validate prompt for output 9:
* OllamaGenerateAdvance 10:
  - Value not in list: model: 'brxce/stable-diffusion-prompt-generator:latest' not in ['llama3:8b-instruct-q4_K_M', 'llama3', 'phi3:3.8b-mini-instruct-4k-q4_K_M', 'phi3', 'phi3:3.8b-mini-instruct-4k-fp16']
Output will be ignored
Failed to validate prompt for output 11:
Output will be ignored
invalid prompt: {'type': 'prompt_outputs_failed_validation', 'message': 'Prompt outputs failed validation', 'details': '', 'extra_info': {}}

Attached is the response from the /api/tags request:

response.json

I downloaded llama3:8b-instruct-q4_K_M and if I select that it does work, but it's strange that it won't let me select from the returned list of models.
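
That "Value not in list" message comes from ComfyUI's server-side prompt validation: a combo input's value is checked against the list the node's INPUT_TYPES returns in Python at queue time, so a dropdown that was refreshed only in the browser can still be rejected. A rough sketch of the shape involved (illustrative names, not this repo's exact code):

```python
# Rough sketch of how a ComfyUI node declares a combo ("list") input.
# ComfyUI validates the queued value against whatever list INPUT_TYPES
# returns on the server, which is why a stale server-side list produces
# the "Value not in list" error.
class OllamaGenerateSketch:
    @classmethod
    def INPUT_TYPES(cls):
        return {
            "required": {
                # The first element of the tuple is the list of allowed values.
                "model": (["llama3", "phi3"],),
                "prompt": ("STRING", {"multiline": True}),
            }
        }

    RETURN_TYPES = ("STRING",)
    FUNCTION = "generate"
    CATEGORY = "Ollama"

    def generate(self, model, prompt):
        # The real node would call the Ollama API here; omitted in this sketch.
        return (prompt,)
```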

stavsap commented 4 months ago

(screenshot)

Tried to reproduce it without success; it seems to work fine with this model.

Is the list you see in the dropdown the same as the output of the ollama list command?

Do you also have Ollama running on localhost?

karl0ss commented 4 months ago

I don't have Ollama locally; it's running in an LXC on the same host, and it is being used remotely like this for OpenWebUI as well as Continue, so that setup works OK.

The returned list, as in my screenshot, is the list of models that I have on my server, and it matches the response.json I sent, but as soon as I press Queue Prompt it gives that error, as if it didn't update the "list" variable with the response from my Ollama instance and is still using the "preset/demo" list...

It doesn't matter what model I select; it will only work if I download one of the exact models it has as the "default".

stavsap commented 4 months ago

There is no preset list; it is dynamically populated from the API call to /api/tags.

Are you on the latest version?

Try refreshing the browser. To force a list update you can select the URL input; pressing OK will re-fetch the list.

stavsap commented 4 months ago

This is the model update hook:

https://github.com/stavsap/comfyui-ollama/blob/main/web/js/OllamaNode.js

Each time the URL form is updated, the model names are fetched again.

stavsap commented 2 months ago

Is this still an issue?

wings4ever commented 2 months ago

Man, see custom_nodes\ComfyUi-Ollama-YN\CompfyuiOllama.py, lines 188 & 305. Add your model to the list, like "model": (["llama3:8b-instruct-q4_K_M", "qwen2:7b", "llama3", "phi3:3.8b-mini-instruct-4k-q4_K_M", "phi3", "phi3:3.8b-mini-instruct-4k-fp16"],), then save and restart ComfyUI. Solved.
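
For clarity, a hypothetical sketch of the kind of edit described above, which hard-codes the missing name into a static list rather than relying on the dynamic fetch:

```python
# Hypothetical illustration of the workaround above (paths and line numbers are
# the ones given in the comment): extend the hard-coded combo list with the
# missing model name so server-side validation accepts it.
MODELS = [
    "llama3:8b-instruct-q4_K_M",
    "qwen2:7b",
    "llama3",
    "phi3:3.8b-mini-instruct-4k-q4_K_M",
    "phi3",
    "phi3:3.8b-mini-instruct-4k-fp16",
    "brxce/stable-diffusion-prompt-generator:latest",  # manually added entry
]

# Inside the node's INPUT_TYPES this list is used as the combo's allowed values:
#     "model": (MODELS,),
```

Note this only works around the validation error and has to be repeated for every new model, so fixing the dynamic list refresh is the better long-term solution.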

stavsap commented 2 months ago

ComfyUi-Ollama-YN is not this repo... I think you are referring to this one: https://github.com/wujm424606/ComfyUi-Ollama-YN