stavsap / comfyui-ollama (Apache License 2.0, 335 stars, 29 forks)
Issues
#47 The model options cannot pop up a window to enter data (iwin99, opened 2 days ago, 1 comment)
#46 Model fetching does not work with ollama 0.3.12 (RaemyS, closed 3 days ago, 6 comments)
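Several issues in this list (#46 here, and #39, #29, #15, #5 below) report the node failing to list installed models. A minimal sketch of querying Ollama's standard GET /api/tags endpoint, assuming a local server on the default port (the URL and helper names are illustrative, not the node's actual code):

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://127.0.0.1:11434"  # default local endpoint; adjust for remote hosts

def parse_model_names(tags_payload: dict) -> list:
    # /api/tags returns {"models": [{"name": "llama3:8b", ...}, ...]}
    return [m["name"] for m in tags_payload.get("models", [])]

def fetch_model_names(base_url: str = OLLAMA_URL) -> list:
    # Raises URLError if the server is unreachable
    # (compare the connection-reset and 503 reports above)
    with urlopen(f"{base_url}/api/tags") as resp:
        return parse_model_names(json.load(resp))
```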
#45 Text does not show in Text repo (Yanquansu, opened 1 week ago, 5 comments)
#44 Debug option not working (DanWalsh82, closed 1 week ago, 3 comments)
#43 The qwen2.5 system command does not work (zhaoqi571436204, closed 1 week ago, 1 comment)
#42 [win 10054] The remote host forcibly closed an existing connection (Sui-zzZ, closed 3 weeks ago, 0 comments)
#41 generate-advance workflow (lunatico67, closed 5 days ago, 1 comment)
#40 Model undefined (jonny7737, closed 3 weeks ago, 0 comments)
#39 My computer has 5 ollama models, but the "Ollama Generate" node finds just one (maxstewm, closed 4 weeks ago, 2 comments)
#38 Update CompfyuiOllama.py (RandomGitUser321, closed 1 month ago, 1 comment)
#37 About the HTTP 503 error code: a suggestion for Chinese users (ZzzasdfghjklZzz, closed 4 weeks ago, 2 comments)
#36 This node is great; please add an option to keep the model resident in VRAM (JonesChou, opened 1 month ago, 0 comments)
#35 Comfyui-ollama node name display error (delcompan, closed 4 weeks ago, 1 comment)
#34 Help! (slash1224, closed 3 weeks ago, 7 comments)
#33 I got this issue; how do I deal with it? (zmczmc123654, closed 1 month ago, 4 comments)
#32 After restarting comfyui, the models disappear? (JioJe, closed 1 month ago, 7 comments)
#31 Refreshing comfyui (refresh button) sets model to undefined (brianmlatimer, closed 1 month ago, 2 comments)
#30 Ollama-YN taking over this node (Niutonian, closed 1 month ago, 1 comment)
#29 Unable to display models (songhao664, closed 1 month ago, 10 comments)
#28 Update PyProject Toml - License (haohaocreates, closed 1 month ago, 0 comments)
#27 Not working... (K-O-N-B, closed 2 months ago, 2 comments)
#26 Please help (missscott, closed 2 months ago, 1 comment)
#25 Would it allow me to save and read the context as a file? (mahougigi, closed 2 months ago, 13 comments)
#24 Basic example failed with error 500 (cherishh, closed 1 month ago, 1 comment)
#23 Getting a 'load_duration' and sometimes a 'context' error (Appolonius001, closed 1 month ago, 11 comments)
#22 The following error was encountered while trying to retrieve the URL: http://127.0.0.1:11434/api/generate (BayJ233, closed 1 month ago, 1 comment)
#21 Import failed (JioJe, closed 2 months ago, 6 comments)
#20 Prompt outputs failed validation (karl0ss, closed 4 weeks ago, 10 comments)
#19 Automatic pulling of the model [feature] (bigcat88, opened 3 months ago, 1 comment)
#18 Add one-round chat node using ollama.Client.chat (LingXuanYin, opened 3 months ago, 3 comments)
#17 Option to remember context between runs (bezo97, closed 3 months ago, 0 comments)
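Issues #25 and #17 both ask about persisting conversation context. A non-streaming Ollama /api/generate response includes a "context" field that can be passed back in the next request; a minimal sketch of saving it to disk between runs (the file path and helper names are illustrative assumptions):

```python
import json
from pathlib import Path

def save_context(context: list, path: str) -> None:
    # 'context' is the token list returned in a non-streaming
    # /api/generate response
    Path(path).write_text(json.dumps(context))

def load_context(path: str):
    # Returns the previously saved context, or None on a fresh start
    p = Path(path)
    return json.loads(p.read_text()) if p.exists() else None

# The next request would then carry the saved context, e.g.:
# payload = {"model": "llama3", "prompt": "...", "context": load_context("ctx.json")}
```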
#16 Error (FemBoxbrawl, closed 3 months ago, 1 comment)
#15 Update today does not allow picking a model (makeitrad, closed 4 months ago, 6 comments)
#14 feat: Add Ollama API to get available models (chenpx976, closed 4 months ago, 2 comments)
#13 Add pyproject.toml for Custom Node Registry (haohaocreates, closed 3 months ago, 3 comments)
#12 Add GitHub Action for Publishing to Comfy Registry (haohaocreates, closed 3 months ago, 2 comments)
#11 Add keep_alive option (0xRavenBlack, closed 4 months ago, 2 comments)
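The keep_alive request field that issue #11 asks for (and that the VRAM-residency requests in #36 touch on) controls how long Ollama keeps the model loaded after a call. A minimal sketch of a request body using it; the helper name is illustrative, but the values follow Ollama's documented semantics:

```python
def build_generate_payload(model: str, prompt: str, keep_alive="5m") -> dict:
    # keep_alive: a duration string like "10m", -1 to keep the model
    # resident in VRAM indefinitely, or 0 to unload it immediately
    # after the call (frees video memory between runs)
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,
        "keep_alive": keep_alive,
    }
```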
#10 Very good tool! I've been using it for a while, but recently it became particularly slow, presumably because it does not release the video memory; can you add an option to release it? (kenic123, closed 4 months ago, 2 comments)
#9 Model not found locally, downloading from HuggingFace... (kenic123, closed 1 month ago, 4 comments)
#8 Thank you! (dicksondickson, closed 4 months ago, 0 comments)
#7 Cannot get llava-llama:8b to work (burritotrex, closed 3 months ago, 7 comments)
#6 Can you add an API timeout? Sometimes it gets stuck. (lldacing, opened 4 months ago, 0 comments)
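For the hang reported in #6, a client-side timeout can be set on the HTTP call itself. A minimal sketch with the stdlib, assuming a plain POST to /api/generate (helper names are illustrative, not the node's actual implementation):

```python
import json
from urllib.request import Request, urlopen

def build_generate_request(base_url: str, payload: dict) -> Request:
    # Request with a data body defaults to the POST method
    return Request(
        f"{base_url}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

def generate_with_timeout(base_url: str, payload: dict, timeout_s: float = 120.0) -> dict:
    # timeout= makes urlopen raise a timeout error instead of
    # blocking forever when the server stops responding
    with urlopen(build_generate_request(base_url, payload), timeout=timeout_s) as resp:
        return json.load(resp)
```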
#5 Could the list of models installed in ollama be read? (smae08, closed 4 months ago, 5 comments)
#4 Seed input doesn't take an int value (Fictiverse, closed 4 months ago, 2 comments)
#3 Error in image inversion (liulsg, closed 5 months ago, 1 comment)
#2 Stupid question... (cap-steve, closed 5 months ago, 2 comments)
#1 Add an input node for Ollama Generate (aaronsb, closed 6 months ago, 1 comment)