heshengtao / comfyui_LLM_party

Dify in ComfyUI includes Omost, GPT-SoVITS, ChatTTS, GOT-OCR2.0, and FLUX prompt nodes; access to Feishu and Discord; and adapts to all LLMs with OpenAI/Gemini-like interfaces, such as o1, Ollama, Grok, Qwen, GLM, DeepSeek, Moonshot, Doubao. Adapted to local LLMs, VLMs, and GGUF models such as Llama 3.2. Linkage Neo4j KG, GraphRAG / RAG / HTML-to-image.
GNU Affero General Public License v3.0
986 stars 93 forks

Is there a way to randomize and/or specify the seed in LLM? #76

Open anastasiuspernat opened 2 months ago

anastasiuspernat commented 2 months ago

Usually all models offer a way to specify a seed value for reproducibility. I also noticed that subsequent calls to the LLM/Llama model produce very similar results. So it would be really cool to have an option to specify the seed, or at least a switch to randomize it on every call. Thank you!
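For reference, a minimal sketch of the requested behavior, assuming an OpenAI-compatible chat completions payload that accepts a `seed` field (the function and field names here are illustrative, not the party nodes' actual API):

```python
import random

def build_request(prompt, seed=None):
    """Build an OpenAI-style chat payload; randomize the seed when none is given."""
    if seed is None:
        # Fresh seed on every call -> varied outputs between runs.
        seed = random.randrange(2**31)
    return {
        "model": "llama-3.2",
        "messages": [{"role": "user", "content": prompt}],
        # A fixed seed makes sampling reproducible on backends that support it.
        "seed": seed,
    }

fixed = build_request("hello", seed=42)   # reproducible
fresh = build_request("hello")            # randomized each call
```

Note that `seed` only guarantees reproducibility on backends that actually implement seeded sampling; many hosted APIs treat it as best-effort.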

heshengtao commented 2 months ago

In general, increasing the temperature increases the richness of the output. You can try using the extra parameters node to set more parameters for the LLM. Both top-k and top-p also affect the randomness of the LLM.
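As a sketch, the sampling knobs mentioned above could be attached to an OpenAI-style request body like this (parameter names follow common OpenAI/llama.cpp conventions; the extra parameters node may expose them differently):

```python
def with_sampling_params(payload, temperature=0.7, top_p=0.9, top_k=40):
    """Attach sampling parameters to a request payload.

    - temperature: higher values flatten the token distribution (more variety)
    - top_p: keep the smallest set of tokens whose cumulative probability >= top_p
    - top_k: keep only the k most likely tokens before sampling
    """
    out = dict(payload)  # copy so the original payload is untouched
    out.update({"temperature": temperature, "top_p": top_p, "top_k": top_k})
    return out

req = with_sampling_params(
    {"model": "llama-3.2", "messages": []},
    temperature=1.2,  # raise for richer, less repetitive output
)
```

Raising `temperature` (or loosening `top_p`/`top_k`) makes consecutive calls diverge more, which may address the "very similar results" observation even without seed control.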