heshengtao / comfyui_LLM_party

LLM Agent Framework in ComfyUI. Includes Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes; provides access to Feishu and Discord; and adapts to all LLMs with OpenAI/Gemini-like interfaces, such as o1, ollama, grok, qwen, GLM, deepseek, moonshot, and doubao. Also adapted to local LLMs, VLMs, and gguf models such as llama-3.2, with Neo4j KG linkage, graphRAG / RAG, and html-to-image support.
GNU Affero General Public License v3.0

Model selection uses a drop-down list #74

Closed alex13by closed 3 months ago

alex13by commented 3 months ago

It is very inconvenient to fill in the model name manually, and also to have to go to the directory to check whether the model is actually there.

heshengtao commented 3 months ago

I think some users need that freedom. The current approach lets users fill in a model from any OpenAI-compatible interface, or even create their own model APIs. Ollama models are also increasing day by day, there are many, many ways to call ollama, and its model library is extremely large. I don't think I should lock most models out of the LLM party for a little convenience.

If you read my how_to_use.md carefully, you will find a node called "Load Model Name" that loads the model names you have configured in the party's config.ini file. This is my balance between freedom and convenience.

If you still think it is not convenient enough, please propose a better solution. The open source community is maintained by everyone and serves the entire community; I will not inconvenience other users for the sake of your personal convenience.
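For readers unfamiliar with the approach described above, here is a minimal sketch of how a node like "Load Model Name" could populate a drop-down from a config.ini file. The section layout (`base_url`/`api_key` keys, section names used as model names) is an assumption for illustration only; the actual format used by comfyui_LLM_party is documented in its how_to_use.md.

```python
# Hypothetical sketch: read user-configured model names from an INI file.
# The section/key layout below is assumed, not the project's actual schema.
import configparser
import os
import tempfile

SAMPLE_CONFIG = """\
[my-gpt-4o]
base_url = https://api.openai.com/v1
api_key = sk-placeholder

[local-llama-3.2]
base_url = http://localhost:11434/v1
api_key = ollama
"""

def load_model_names(path):
    """Return one model name per INI section, suitable for a drop-down list."""
    parser = configparser.ConfigParser()
    parser.read(path, encoding="utf-8")
    return parser.sections()

if __name__ == "__main__":
    # Write a sample config to a temporary file and list its model names.
    with tempfile.NamedTemporaryFile("w", suffix=".ini", delete=False) as f:
        f.write(SAMPLE_CONFIG)
        config_path = f.name
    try:
        print(load_model_names(config_path))
    finally:
        os.remove(config_path)
```

A design like this keeps the free-form text input for arbitrary APIs while still offering a convenient pick-list of whatever the user has already configured.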