if-ai / ComfyUI-IF_AI_tools

ComfyUI-IF_AI_tools is a set of custom nodes for ComfyUI that allows you to generate prompts using a local Large Language Model (LLM) via Ollama. This tool enables you to enhance your image generation workflow by leveraging the power of language models.
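For readers new to the Ollama side of this setup, here is a minimal sketch of getting a local model in place. The model name and the port are assumptions based on Ollama's standard defaults, not taken from this repository's documentation:

```sh
# Minimal sketch of the Ollama side these nodes rely on (defaults assumed):
# install Ollama, pull a chat model, and check that the local API answers.
ollama pull llama3                    # any Ollama chat model works in principle
ollama serve                          # skip if Ollama already runs as a background service
curl http://localhost:11434/api/tags  # Ollama's default endpoint; lists locally installed models
```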

Dear if-ai, could you please provide a larger AI language model? #37

Closed: wangzi7654321 closed this issue 1 month ago

wangzi7654321 commented 1 month ago

Dear if-ai,

I hope you don’t mind the intrusion. Could you possibly provide a larger version of the if-ai large language model? For instance, a quantized version of llama3 70B. My GPU has 24GB of memory, and the current if-ai large language model you offer on Hugging Face is too small for my needs.

I would be extremely grateful!

if-ai commented 1 month ago

Hi, thank you. I released the LoRA for people who want to attach it to bigger models. Currently my system can't handle big models well, and I have a limited budget for training. I want to train a model that automatically makes workflows and another that can build custom nodes, but honestly I have already spent way too much on these models and on making the datasets without any collaboration, so I can't right now. Also, the bigger models will improve the prompts, but it shouldn't make such a great difference.

wangzi7654321 commented 1 month ago

Hello, thank you for your hard work and dedication. May I ask if this is the LoRA for the llama3 large language model? https://huggingface.co/impactframes/IF_AI_SD_PromptMkr

if-ai commented 1 month ago

Yes, that should work on the llama3 70B version, and even on other finetunes as long as they are llama3. I will close this anyway; I will try to do a 70B when I have a chance.
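For anyone who wants to try this before an official 70B release, one possible way to layer the LoRA onto a bigger Llama 3 base inside Ollama is sketched below. This is not the maintainer's recipe: the adapter would first have to be converted from the Hugging Face safetensors LoRA into a format Ollama's ADAPTER instruction accepts (for example a GGUF adapter produced with llama.cpp's conversion tooling), and the file and model names are illustrative.

```sh
# Hypothetical sketch: combine a Llama 3 70B base with the IF_AI_SD_PromptMkr
# LoRA inside Ollama. Assumes the adapter has already been converted to a
# GGUF adapter file; names and paths are examples only.
cat > Modelfile <<'EOF'
FROM llama3:70b
ADAPTER ./if_ai_sd_promptmkr.gguf
EOF

ollama create llama3-70b-ifai -f Modelfile   # register the combined model
ollama run llama3-70b-ifai "a cozy cabin in a snowy forest at dusk"
```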

CoUIfy commented 2 weeks ago

> Hello, thank you for your hard work and dedication. May I ask if this is the LoRA for the llama3 large language model? https://huggingface.co/impactframes/IF_AI_SD_PromptMkr

Can you explain step by step how to use this? Also, could you tell me how to install models other than with the usual "ollama run xModel"? (For instance, the Proteus model that is recommended in the repo.)
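The thread does not answer this, but a common way to register a model that is not in the Ollama library (for example a GGUF file downloaded from Hugging Face) is to point a Modelfile at the local file and build it with ollama create. The file and model names below are placeholders, not from this thread:

```sh
# Hypothetical sketch: install a locally downloaded GGUF model into Ollama
# instead of pulling it with "ollama run <model>". Names are placeholders.
cat > Modelfile <<'EOF'
FROM ./my-model.Q4_K_M.gguf
EOF

ollama create my-model -f Modelfile   # register the local weights with Ollama
ollama run my-model                   # now usable like any pulled model
```

Note that if the recommended Proteus model is an image (diffusion) checkpoint rather than an LLM, it would belong in ComfyUI's models/checkpoints folder rather than in Ollama.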