heshengtao / comfyui_LLM_party

LLM Agent Framework in ComfyUI includes Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes; access to Feishu and Discord; and adapts to all LLMs with OpenAI/Gemini-like interfaces, such as o1, ollama, grok, qwen, GLM, deepseek, moonshot, doubao. Adapted to local LLMs, VLMs, and GGUF models such as llama-3.2; Neo4j KG linkage, graphRAG / RAG / html 2 img
GNU Affero General Public License v3.0

'Chat' object has no attribute 'parameters' #65

Closed davedaverave closed 3 months ago

davedaverave commented 3 months ago

[Screenshot 2024-08-11 120301 attached] Am I doing something wrong with ollama, or am I not using LLM party correctly?

heshengtao commented 3 months ago

The role of ollama is to turn your local model into an API, so you should use the API loader, and you also need to use the API model chain. You can refer to this workflow file: https://github.com/heshengtao/comfyui_LLM_party/blob/main/workflow_tutorial/LLM_Party%20for%20API%20Models.json
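For reference, the idea behind the API-loader setup can be sketched in plain Python. This is a minimal sketch, not the party node's actual implementation: ollama serves an OpenAI-compatible endpoint at `localhost:11434/v1`, and an API loader just needs that base URL, a placeholder key, and a model tag. The model name `llama3.2` and the helper names here are illustrative assumptions; no request is actually sent.

```python
def build_ollama_config(model="llama3.2"):
    """Return the settings an OpenAI-style API loader would need.
    The model tag is an assumption; use whatever `ollama list` shows."""
    return {
        "base_url": "http://localhost:11434/v1",  # ollama's OpenAI-compatible endpoint
        "api_key": "ollama",                      # any non-empty string; ollama ignores it
        "model": model,
    }

def build_chat_request(config, user_message):
    """Assemble the JSON body for POST {base_url}/chat/completions."""
    return {
        "model": config["model"],
        "messages": [{"role": "user", "content": user_message}],
    }

config = build_ollama_config()
request = build_chat_request(config, "Hello from LLM party")
print(request["model"])  # llama3.2
```

The key point is that the local model is reached through the same chat-completions shape a hosted API uses, which is why the API loader and API model chain apply rather than the local-model nodes.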

There are sample workflows in this folder: https://github.com/heshengtao/comfyui_LLM_party/blob/main/workflow_tutorial/

There are more workflows I haven't organized yet: https://github.com/heshengtao/comfyui_LLM_party/blob/main/workflow/