turboderp / exui
Web UI for ExLlamaV2
MIT License · 449 stars · 43 forks
Issues (sorted by newest)
| # | Title | Author | Status | Comments |
|---|-------|--------|--------|----------|
| #67 | Prompt format for Qwen? | charleswg | closed 1 month ago | 0 |
| #66 | Cuda jit_compile error | charleswg | closed 1 month ago | 1 |
| #65 | Feature request: reordering list items | Zueuk | opened 2 months ago | 0 |
| #64 | Add DRY and Banned Strings to Notepad | Downtown-Case | opened 2 months ago | 0 |
| #63 | Add option to copy block of text to clipboard. | LlamaEnjoyer | closed 2 months ago | 0 |
| #62 | Dry Sampler Support | alexbrowngh | opened 2 months ago | 0 |
| #61 | NameError: name 'exllamav2_ext' is not defined | jwax33 | closed 2 months ago | 5 |
| #60 | Continue has a race | IMbackK | opened 4 months ago | 0 |
| #59 | Model download URLs? | SoftologyPro | opened 4 months ago | 1 |
| #58 | Notepad: generate and token buttons disabled before writing something in the context | IMbackK | opened 4 months ago | 2 |
| #57 | Determinism | IMbackK | closed 4 months ago | 2 |
| #56 | gguf models | DjagbleyEmmanuel | opened 5 months ago | 2 |
| #55 | UI Loads to a blank grey screen | fake-name | opened 6 months ago | 1 |
| #54 | Running server from different folder raises Internal Server Error. | fatality14 | opened 6 months ago | 1 |
| #53 | server.py won't open | ReMeDy-TV | opened 7 months ago | 2 |
| #52 | Optional code to use a local copy of marked and ensure interface can run offline | dandm1 | opened 7 months ago | 0 |
| #51 | Add tooltip to briefly explain the parameters in chat tab | dagbdagb | opened 7 months ago | 0 |
| #50 | Permit setting default prompt format and max tokens in models tab | dagbdagb | opened 7 months ago | 0 |
| #49 | Feature request - Colorized Code Markup | chaincrafter | opened 7 months ago | 0 |
| #48 | Update prompts.py with Llama-3 prompt format. | cmhamiche | closed 7 months ago | 3 |
| #47 | Launching throws error with Exllamav2 stating "Tokenizers" are not found | Adzeiros | opened 7 months ago | 2 |
| #46 | Out of memory | AICodingTester | opened 8 months ago | 16 |
| #45 | how to name the sessions? | Fuckingnameless | closed 8 months ago | 4 |
| #44 | how to set Q4 cache on? | Fuckingnameless | closed 8 months ago | 0 |
| #43 | Edit context | dagbdagb | closed 7 months ago | 3 |
| #42 | Client-side "save to file" function for code output? | dagbdagb | opened 8 months ago | 1 |
| #41 | Answer does not escape HTML tags | chaincrafter | closed 7 months ago | 2 |
| #40 | DLL load failed while importing exllamav2_ext | jagerius | opened 8 months ago | 2 |
| #39 | VRAM Usage | ec111 | opened 8 months ago | 5 |
| #38 | CUDA_HOME variable not set | EidosL | opened 8 months ago | 4 |
| #37 | ImportError: DLL load failed while importing exllamav2_ext: The specified procedure could not be found. | Zueuk | opened 8 months ago | 0 |
| #36 | "copy text" button in code output windows? | dagbdagb | closed 8 months ago | 3 |
| #35 | Make text field compatible with the LanguageTool browser extension. | brucethemoose | closed 9 months ago | 2 |
| #34 | ninja: build stopped: subcommand failed. | kurugai | closed 9 months ago | 2 |
| #32 | `exui` doesn't escape HTML tags in user input. | andrewgross | opened 10 months ago | 3 |
| #31 | Windows 11 install procedure and missing dependencies (transformers) | Anon426 | opened 10 months ago | 2 |
| #30 | Multiple Roles won't answer or interrogate with each other. | chaincrafter | closed 10 months ago | 2 |
| #29 | Error "architectures" | Gyramuur | closed 11 months ago | 2 |
| #28 | exui not able to connect to server from Win10 browser - WSL2 Debian | mindkrypted | opened 11 months ago | 1 |
| #27 | Issues with window management (CachyOS / Arch) | JakeSSRN | opened 11 months ago | 0 |
| #26 | Not finding CUDA when it's installed both in conda env and for Windows globally (different versions) | ewebgh33 | opened 11 months ago | 10 |
| #25 | Error answering question, AssertionError: filter excludes all tags | xldistance | opened 11 months ago | 0 |
| #24 | Will this ever have a mobile version? | lovee333 | opened 11 months ago | 0 |
| #23 | How to increase max_tokens to 4096 | xldistance | closed 11 months ago | 2 |
| #22 | KeyError: '<｜end▁of▁sentence｜>' Tokenizer crash | SinanAkkoyun | opened 12 months ago | 0 |
| #21 | Set ropescale of draft model | SinanAkkoyun | closed 12 months ago | 1 |
| #20 | Speculative Decoding slower (only in ExUI) | SinanAkkoyun | closed 12 months ago | 3 |
| #19 | Auto load on multi gpus | wangyu1997 | closed 12 months ago | 2 |
| #18 | build error | deeeed | closed 12 months ago | 2 |
| #17 | vllm backend? | ekg | opened 1 year ago | 1 |