xlinx / sd-webui-decadetw-auto-prompt-llm

MIT License

it would be a good idea to add a slightly longer LLM max length (tokens) #9

Open LadyFlames opened 3 weeks ago

LadyFlames commented 3 weeks ago

I think it might be a good idea to make the LLM max length a bit longer, or better, to let the LLM finish before it stops. Half of the LLM's answer is missing or incomplete most of the time, and it sometimes requires several dozen calls to the LLM before it gives a complete answer. The max length (tokens) should be raised to something like 750-1000, maybe even higher.

xlinx commented 2 weeks ago

Okay, got it. It is now 5k, and top_k and top_p are coming in the next version. When you click the generate button, the result is sent into SD; do you really want to send more than 1000 tokens into the SD CLIP encoder?

Update: after trying it with FLUX, hmm, so many tokens bring so many details. Yep, we need this.
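For context on why very long prompts are costly: Stable Diffusion's CLIP text encoder handles roughly 75 tokens per window, so webui-style frontends split a long prompt into 75-token chunks and encode each chunk separately. A minimal sketch of that splitting, using placeholder IDs rather than real CLIP tokens:

```python
# Sketch: split a long token sequence into CLIP-sized chunks, the way
# webui-style frontends handle prompts longer than one encoder window.
# CHUNK_SIZE = 75 is the usual per-chunk budget (77 positions minus
# the start/end special tokens).
CHUNK_SIZE = 75

def split_into_chunks(token_ids, chunk_size=CHUNK_SIZE):
    """Return the token sequence as a list of pieces of <= chunk_size."""
    return [token_ids[i:i + chunk_size]
            for i in range(0, len(token_ids), chunk_size)]

# A 1000-token LLM answer becomes 14 chunks; each chunk is a separate
# CLIP encode, so encoding cost grows with the length of the answer.
fake_tokens = list(range(1000))  # placeholder IDs, not real CLIP tokens
chunks = split_into_chunks(fake_tokens)
```

This is only an illustration of the chunking idea, not the extension's actual code path.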

LadyFlames commented 2 weeks ago

I did some testing myself in SD Forge using Flux. I got this result with max length 500 tokens, top_k 20, top_p 0.9, and LLM temperature 0.7 (00011-42929162). Using more max length tokens might work better with Flux.
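Those sampling settings map directly onto a typical local-LLM request. A minimal sketch of the payload an extension like this might send to an OpenAI-compatible completion endpoint; the model name is a hypothetical placeholder, and only the sampling keys matter here:

```python
# Sketch: the sampling settings discussed above, expressed as a request
# body for an OpenAI-compatible completions endpoint. The model name is
# a placeholder; "top_k" is not in the OpenAI spec but is accepted by
# many local servers (e.g. llama.cpp-style backends).
def build_llm_request(prompt,
                      max_tokens=500,
                      top_k=20,
                      top_p=0.9,
                      temperature=0.7):
    """Assemble the JSON body for a local LLM completion call."""
    return {
        "model": "local-model",      # placeholder model name
        "prompt": prompt,
        "max_tokens": max_tokens,    # raise this for longer answers
        "top_k": top_k,
        "top_p": top_p,
        "temperature": temperature,
    }

payload = build_llm_request("masterpiece, cinematic lighting, portrait")
```

Raising `max_tokens` is what lets the LLM finish its answer; the other three knobs only shape sampling, not length.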

xlinx commented 2 weeks ago

It's 5000 now. The max token limit was raised to 5000 a few days ago.