stavsap / comfyui-ollama · Apache License 2.0 · 372 stars · 34 forks

Issues
#66 · How to free the GPU memory used? · by Chengym2023 · opened 16 hours ago · 1 comment
#65 · OllamaGenerateAdvance: invalid literal for int() with base 10 · by Amazon90 · opened 6 days ago · 2 comments
#64 · Allow keepalive to be hours or seconds [feature request] · by bigcat88 · opened 1 week ago · 1 comment
#63 · URL and Model load separately · by Khampol · closed 14 hours ago · 2 comments
#62 · Unload Model: how? · by Khampol · closed 14 hours ago · 6 comments
#61 · Can you add openwebui auth for this tool? · by morgan55555 · closed 14 hours ago · 1 comment
#60 · What is the best system instruction? · by gonzalu · closed 14 hours ago · 2 comments
#59 · keep_alive needs a '-1' option (keep alive forever) · by morgan55555 · opened 2 weeks ago · 4 comments
#58 · Batch prompts: ComfyUI error report, KeyError: 'context'; requesting help · by lopezguan · opened 2 weeks ago · 1 comment
#57 · Ollama vision isn't working · by Autantpourmoi · closed 3 weeks ago · 1 comment
#56 · x/llama3.2-vision:latest not working? · by Latentnaut · closed 14 hours ago · 9 comments
#55 · Web Search function? · by Khampol · opened 4 weeks ago · no comments
#54 · I wish there were some instructions on how to install llama instruct · by ClothingAI · closed 14 hours ago · 4 comments
#53 · Prerequisites? · by ClothingAI · closed 4 weeks ago · 2 comments
#52 · Add seed input to OllamaVision · by vilanele · closed 4 weeks ago · 2 comments
#51 · What does keep_context do? · by Jonseed · closed 14 hours ago · 3 comments
#50 · OllamaVision must provide a model · by Weydap · opened 1 month ago · 4 comments
#49 · Feature request: add 'format' parameter to get valid JSON responses · by Jonseed · closed 1 month ago · 6 comments
#48 · How to summarize video with VLM models · by whmc76 · opened 1 month ago · 4 comments
#47 · The model options cannot pop up a window to enter data · by iwin99 · closed 14 hours ago · 3 comments
#46 · Model fetching does not work with ollama 0.3.12 · by RaemyS · closed 1 month ago · 6 comments
#45 · Text does not show in the Text repo · by Yanquansu · opened 1 month ago · 11 comments
#44 · Debug option not working · by DanWalsh82 · closed 1 month ago · 3 comments
#43 · The qwen2.5 system command does not work · by zhaoqi571436204 · closed 1 month ago · 1 comment
#42 · [win 10054] The remote host forcibly closes an existing connection · by Sui-zzZ · closed 2 months ago · no comments
#41 · generate-advance workflow · by lunatico67 · closed 1 month ago · 1 comment
#40 · model undefined · by jonny7737 · closed 2 months ago · no comments
#39 · My computer has 5 Ollama models, but the "ollama generate" node finds just one · by maxstewm · closed 2 months ago · 2 comments
#38 · Update CompfyuiOllama.py · by RandomGitUser321 · closed 2 months ago · 1 comment
#37 · About the HTTP 503 error code: a suggestion for Chinese users · by ZzzasdfghjklZzz · closed 2 months ago · 2 comments
#36 · This node is great; please add an option to keep the model resident in VRAM · by JonesChou · closed 14 hours ago · no comments
#35 · Comfyui-ollama node name display error · by delcompan · closed 2 months ago · 1 comment
#34 · Help! · by slash1224 · closed 2 months ago · 7 comments
#33 · I got this issue; how do I deal with it? · by zmczmc123654 · closed 2 months ago · 4 comments
#32 · When starting ComfyUI again, the models vanish? · by JioJe · closed 2 months ago · 7 comments
#31 · Refreshing comfyui (refresh button) sets model to undefined · by brianmlatimer · closed 2 months ago · 2 comments
#30 · Ollama-YN taking over this node · by Niutonian · closed 3 months ago · 1 comment
#29 · Models are not displayed · by songhao664 · closed 2 months ago · 10 comments
#28 · Update PyProject Toml - License · by haohaocreates · closed 3 months ago · no comments
#27 · not working.... · by K-O-N-B · closed 3 months ago · 2 comments
#26 · Please help · by missscott · closed 3 months ago · 1 comment
#25 · Would it allow me to save and read the context as a file? · by mahougigi · closed 3 months ago · 13 comments
#24 · basic example failed with error 500 · by cherishh · closed 3 months ago · 1 comment
#23 · Getting a 'load_duration' and sometimes a 'context' error · by Appolonius001 · closed 3 months ago · 11 comments
#22 · The following error was encountered while trying to retrieve the URL: http://127.0.0.1:11434/api/generate · by BayJ233 · closed 2 months ago · 1 comment
#21 · Import failed · by JioJe · closed 4 months ago · 6 comments
#20 · Prompt outputs failed validation · by karl0ss · closed 2 months ago · 10 comments
#19 · automatic pulling of the model [feature] · by bigcat88 · opened 4 months ago · 1 comment
#18 · Add one-round chat node using ollama.Client.chat · by LingXuanYin · closed 3 weeks ago · 3 comments
#17 · Option to remember context between runs · by bezo97 · closed 5 months ago · no comments