Closed davedaverave closed 3 months ago
The role of ollama is to expose your local model as an API, so you need to use the API loader node, and you also need to use the API model chain. You can refer to this workflow file: https://github.com/heshengtao/comfyui_LLM_party/blob/main/workflow_tutorial/LLM_Party%20for%20API%20Models.json
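To make the point above concrete, here is a minimal sketch of what an API loader talks to when it points at ollama. It assumes ollama is running on its default port (11434) and serving its OpenAI-compatible endpoint under `/v1`; the model name `llama3` is just a placeholder for whatever model you have pulled, and `build_chat_request` is a hypothetical helper, not part of any library.

```python
import json

# Assumption: ollama's default local address and OpenAI-compatible path.
OLLAMA_BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build the JSON body an API loader would POST to /chat/completions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# "llama3" is a placeholder model name; substitute your own pulled model.
body = build_chat_request("llama3", "Hello!")
print(OLLAMA_BASE_URL + "/chat/completions")
print(json.dumps(body))
```

In other words, once ollama is running, the API loader only needs the base URL and a model name, which is why the local-model loader nodes are the wrong fit here.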
There are sample workflows here: https://github.com/heshengtao/comfyui_LLM_party/blob/main/workflow_tutorial/
There are more workflows I haven't organized yet: https://github.com/heshengtao/comfyui_LLM_party/blob/main/workflow/
Am I doing something wrong with ollama, or am I not using LLM party correctly?