heshengtao / comfyui_LLM_party

LLM Agent Framework in ComfyUI. Includes Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes; access to Feishu and Discord; adapts to all LLMs with OpenAI/Gemini-like interfaces, such as o1, Ollama, Grok, Qwen, GLM, DeepSeek, Moonshot, and Doubao. Also adapted to local LLMs, VLMs, and GGUF models such as Llama 3.2; Neo4j knowledge-graph linkage; GraphRAG / RAG / HTML-to-image.
GNU Affero General Public License v3.0

Seems LLM Party and Reactor Nodes can't co-exist in the same env #105

Open boricuapab opened 1 month ago

boricuapab commented 1 month ago

I have to disable the ComfyUI Reactor node to be able to use LLM Party in my env, and vice versa, though I'm not sure which Python packages are conflicting between the two custom nodes.
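Since it's unclear which packages are conflicting, one quick way to narrow it down is to diff the pinned requirements of the two node packs. A minimal sketch (the package names and version pins below are illustrative examples, not the actual requirements of either node):

```python
def parse_pins(lines):
    """Collect 'name==version' pins from a requirements list, keyed by lowercase name."""
    pins = {}
    for line in lines:
        line = line.strip()
        if "==" in line and not line.startswith("#"):
            name, ver = line.split("==", 1)
            pins[name.lower()] = ver
    return pins

def find_conflicts(reqs_a, reqs_b):
    """Return packages pinned by both lists but to different versions."""
    a, b = parse_pins(reqs_a), parse_pins(reqs_b)
    return {name: (a[name], b[name]) for name in a.keys() & b.keys() if a[name] != b[name]}

# Illustrative requirement lists only (not the real files):
party_reqs = ["torch==2.1.1", "transformers==4.40.0"]
reactor_reqs = ["torch==2.3.0", "insightface==0.7.3"]
print(find_conflicts(party_reqs, reactor_reqs))  # {'torch': ('2.1.1', '2.3.0')}
```

In practice you would load each node's `requirements.txt` instead of the hard-coded lists; any package that appears in the output is a candidate for the hang.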

heshengtao commented 1 month ago

Is there any more error information? When both node packs are installed at the same time, what error does the console report?

boricuapab commented 1 month ago

There isn't an error reported; the CLI window just hangs and never launches ComfyUI. After I disable Reactor, ComfyUI is able to launch with the LLM Party custom node pack installed.

(screenshot attachment: llmPartyNReactorNotCoexistingIssue)

RuKapSan commented 1 month ago

If anyone has encountered a similar problem, simply downgrade torch to version 2.2; in my case 2.1.1 worked: `pip install torch==2.1.1 torchaudio torchvision`
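To confirm the downgrade took effect before launching ComfyUI, you can check the installed version at runtime. A small sketch, assuming the "2.2 or lower" bound suggested above; the parsing deliberately ignores suffixes such as `+cu121` or `.dev` nightly tags:

```python
import re

def torch_version_ok(version_string, max_minor=(2, 2)):
    """Return True if the (major, minor) of version_string is <= max_minor."""
    m = re.match(r"(\d+)\.(\d+)", version_string)
    if not m:
        return False
    return (int(m.group(1)), int(m.group(2))) <= max_minor

print(torch_version_ok("2.1.1+cu121"))        # True
print(torch_version_ok("2.5.0.dev20241001"))  # False (a nightly build, as below)

# In a real env you would check the installed build, e.g.:
# import torch
# assert torch_version_ok(torch.__version__), "downgrade torch to <= 2.2"
```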

boricuapab commented 1 month ago

I see. I've been using a torch nightly build, as I'm running some of the recent video models such as CogVideoX and Mochi locally.