HelgeSverre / ollama-gui
A Web Interface for chatting with your local LLMs via the ollama API
https://ollama-gui.vercel.app/
MIT License · 515 stars · 84 forks
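For context on the description above: a client like this talks to a locally running Ollama server over its REST API. The `/api/generate` endpoint and its `model`/`prompt`/`stream` fields are part of the public Ollama API; the helper function, default URL constant, and model name below are illustrative, not taken from this repository's code.

```typescript
// Minimal sketch of a non-streaming completion request to the Ollama API.
// localhost:11434 is Ollama's default listen address.
const OLLAMA_BASE_URL = "http://localhost:11434";

interface GenerateRequest {
  model: string;   // e.g. "llama3"
  prompt: string;  // user's message
  stream: boolean; // false = single JSON response instead of a stream
}

// Build the JSON body for POST /api/generate.
function buildGenerateRequest(model: string, prompt: string): GenerateRequest {
  return { model, prompt, stream: false };
}

// Sending it would look like this (requires a running Ollama server):
// const res = await fetch(`${OLLAMA_BASE_URL}/api/generate`, {
//   method: "POST",
//   body: JSON.stringify(buildGenerateRequest("llama3", "Hello!")),
// });

console.log(JSON.stringify(buildGenerateRequest("llama3", "Hello!")));
```

In practice a chat UI sets `stream: true` and renders tokens as they arrive; several issues below (#32, #17, #5) concern exactly that streaming/rendering path.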
Issues (newest first)
#39 · docker instructions using wrong port? · breadcan · opened 2 weeks ago · 0 comments
#38 · Fix Chinese input method bug · axliupore · opened 1 month ago · 0 comments
#37 · Confirm dialog · izacximenes · opened 1 month ago · 0 comments
#36 · Ollama Model Not Loading in GUI When Using ngrok Tunnels · SecTech12 · opened 1 month ago · 0 comments
#35 · Accessing Ollama GUI via Local IP Doesn't Load LLaMA 3 (localhost:8080 Works) · SecTech12 · closed 1 month ago · 2 comments
#34 · fix: use the user's system locale for date formatting · samhoooo · opened 2 months ago · 0 comments
#33 · simplify customization using .env file · bitsnaps · opened 2 months ago · 0 comments
#32 · Initial characters from a response are split across answers and are in incorrect order · svenha · opened 3 months ago · 0 comments
#31 · It can only visit by localhost:5174, when I visit the webui from another IP, it doesn't work · gogo7707 · closed 3 months ago · 1 comment
#30 · html output is not tidy · uccritae · opened 4 months ago · 1 comment
#29 · Feature Request: allow "editing" the LLM's response · DeflateAwning · closed 3 months ago · 2 comments
#28 · Alternative installation method · evrenesat · opened 4 months ago · 0 comments
#27 · Update README.md · coder-examples · closed 3 months ago · 0 comments
#26 · Make locale configurable · dionyziz · opened 6 months ago · 3 comments
#25 · Request: tag releases on GitHub · DeflateAwning · closed 3 months ago · 3 comments
#24 · [WIP] Add Dockerfile and update Vite configuration · LuisMalhadas · closed 3 months ago · 1 comment
#23 · :sparkles: Add config to db to add custom system prompt to chat · theobaronnat · closed 3 months ago · 2 comments
#22 · Allow upload images for multimodal chats · brdebr · opened 7 months ago · 0 comments
#21 · Fixed the CSS Bug where STRONG tags don't show correctly in dark mode, documented in Issues · MagicPlants · closed 7 months ago · 0 comments
#20 · CSS Bug in AI Response Prose (Dark Mode) · MagicPlants · opened 7 months ago · 2 comments
#19 · Response parsing logic fixes! · felixphixer · opened 7 months ago · 5 comments
#18 · Add support for multimodal models · andreaferretti · opened 7 months ago · 0 comments
#17 · Can't display generated responses · BoredManCodes · opened 7 months ago · 1 comment
#16 · Using Ollama chat API · emsi · opened 8 months ago · 7 comments
#15 · Can't install · HttpAnimation · closed 8 months ago · 4 comments
#14 · Feature Request: Abort on request · melroy89 · closed 9 months ago · 5 comments
#13 · Feature Request: See busy feedback indication · melroy89 · closed 9 months ago · 3 comments
#12 · Scroll-follow improvements, Dynamic BaseURL and Dependency Upgrades · HelgeSverre · closed 9 months ago · 0 comments
#11 · Changing the API URL has no effect · tmfksoft · closed 9 months ago · 2 comments
#10 · Update README.md · codevalley · closed 9 months ago · 0 comments
#9 · Updated README prerequisites section · codevalley · closed 9 months ago · 0 comments
#8 · Models are not loaded · dxcore35 · closed 10 months ago · 2 comments
#7 · Trying to select another model leads to resetting to the first option in the list · mihaim · closed 10 months ago · 4 comments
#6 · No code formatting · witfyl-ravped · closed 10 months ago · 2 comments
#5 · Flipping between chat tabs during output causes output to be incorrectly placed in another chat · Cliftonz · closed 10 months ago · 1 comment
#4 · Possibility to delete text in chat · schauppi · closed 11 months ago · 6 comments
#3 · The answer is incomplete · mutse · closed 11 months ago · 1 comment
#2 · Readme. Use https when cloning repository · avezov · closed 11 months ago · 0 comments
#1 · Git clone url is wrong · KristianRykkje · closed 11 months ago · 2 comments