Closed BreckanM closed 1 week ago
Hey @BreckanM thanks for the report. Are you able to share the error you get when trying to specify a version of the model other than latest?
Ah, it sounds like this is a comfyui griptape nodes bug - I can take a look at this on Monday :)
fix merged
thanks. works perfectly. cheers for the quick fix.
My pleasure, glad it worked :)
Is your feature request related to a problem? Please describe. When the Ollama Agent Config pulls the list of models from Ollama, it grabs all models but shows only the main model family name (without any variant tag). This means the Griptape list can show multiple entries of the same model when they are in fact different variants (e.g. :small vs :medium, :2b vs :7b).
When executing, it seems that only a model tagged ":latest" is actually usable; Ollama throws a "model not found" error if the model in Ollama doesn't match ":latest".
Describe the solution you'd like Please enumerate the full model spec from Ollama, and use that full model name when posting the prompt to Ollama.
e.g. Phi3:Medium, Phi3:14b, Dolphin-Mixtral:8x7B
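A minimal sketch of what "enumerate the full model spec" could look like, assuming Ollama's documented GET /api/tags endpoint on the default local port; the helper names (`full_model_names`, `list_ollama_models`) are illustrative, not part of the Griptape nodes:

```python
import json
from urllib.request import urlopen

def full_model_names(tags_payload: dict) -> list[str]:
    # Each entry's "name" field in the /api/tags response already carries
    # the full spec including the variant tag, e.g. "phi3:medium" — so the
    # fix is simply to not strip everything after the colon.
    return [m["name"] for m in tags_payload.get("models", [])]

def list_ollama_models(base_url: str = "http://localhost:11434") -> list[str]:
    # Assumes a local Ollama server is running on the default port.
    with urlopen(f"{base_url}/api/tags") as resp:
        return full_model_names(json.load(resp))
```

Passing the string returned here verbatim as the `model` field of a prompt request would avoid the "model not found" error, since it matches the tag Ollama actually has installed.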
Describe alternatives you've considered N/A
Additional context