xlinx / sd-webui-decadetw-auto-prompt-llm
MIT License · 39 stars · 4 forks
Issues (newest first)
#20  [BUG] can't run addon · Rogal80 · opened 13 hours ago · 0 comments
#19  problems with the Prompt tokens when using Send to txt2image · LadyFlames · opened 1 day ago · 0 comments
#18  Install.py - no module named 'launch' · Torcelllo · opened 4 days ago · 1 comment
#17  cloud LLM via API key · tazztone · opened 6 days ago · 1 comment
#16  LM Studio: [ERROR] Model does not support images. Please use a model that does. (Error Data: n/a, Additional Data: n/a) · AlexDenthanor · opened 1 week ago · 1 comment
#13  [feature] support wildcard or dynamic prompt to output more different results · AhBumm · opened 1 week ago · 14 comments
#10  suggestion: presets of sys prompts · tazztone · closed 2 weeks ago · 2 comments
#9  it would be a good idea to add a slightly longer LLM max length (tokens) · LadyFlames · opened 2 weeks ago · 3 comments
#8  Only User & Support Roles Are Supported · AlexDenthanor · closed 2 weeks ago · 3 comments
#7  [Forge] - Save_pil_to_file() got an unexpected keyword argument 'name' · CCpt5 · opened 3 weeks ago · 5 comments
#6  prompt formatting issue? · tazztone · closed 3 weeks ago · 1 comment
#5  Develop hot fix · xlinx · closed 4 weeks ago · 0 comments
#4  Process prompts from a file and feed them to the LLM. · caustiq · opened 1 month ago · 1 comment
#3  Unload the LLM from VRAM after each call? · Pdonor · opened 1 month ago · 2 comments
#2  Fix · w-e-w · closed 1 month ago · 3 comments
#1  I think it would be worth adding a detection of whether lora is being used and if so, place the prompt in front of it, or move it to the end of the prompt? · AndreyRGW · opened 1 month ago · 2 comments