taketwo/llm-ollama
LLM plugin providing access to local Ollama models via the HTTP API
Apache License 2.0 · 89 stars · 6 forks
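Since the plugin works by calling Ollama's local HTTP API, a minimal sketch of such a request may be useful context for the issues below (several of them concern connection errors and response handling). This assumes Ollama's default `/api/generate` endpoint on port 11434; it is an illustration, not the plugin's actual implementation:

```python
import json
from urllib import request, error

# Ollama's default local endpoint (assumption: stock install, default port)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_payload(model: str, prompt: str, stream: bool = False) -> dict:
    """Build the JSON body for Ollama's /api/generate endpoint."""
    return {"model": model, "prompt": prompt, "stream": stream}

def generate(model: str, prompt: str) -> str:
    """Send a non-streaming generate request and return the response text.

    Raises URLError (e.g. 'Connection refused', cf. issue #3) if Ollama
    is not running locally.
    """
    body = json.dumps(build_generate_payload(model, prompt)).encode()
    req = request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

With `stream=True` the endpoint instead returns one JSON object per line, which is the code path involved in the streaming-prompt errors reported below.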
source link
Issues
#14 Support for embedding models installed through Ollama (javierhb23, opened 1 month ago, 1 comment)
#13 Support for remote ollama server (dpatschke, closed 1 month ago, 5 comments)
#12 Expose `format="json"`? (jefftriplett, closed 1 month ago, 4 comments)
#11 Change type of stop sequences option to list (Mrmaxmeier, closed 2 months ago, 2 comments)
#10 Avoid KeyError when iterating through messages (simonw, closed 3 months ago, 1 comment)
#9 Error: 'message' when executing a streaming prompt (simonw, closed 3 months ago, 3 comments)
#8 Use try/except to return an empty model list if Ollama is not responding (davedean, closed 3 months ago, 1 comment)
#7 Add pydantic v2 as a dependency (Taik, closed 3 months ago, 1 comment)
#6 Error: 'Options' object has no attribute 'model_dump' (shiyongxin, closed 3 months ago, 4 comments)
#5 Add available modelfile parameters (Taik, closed 4 months ago, 1 comment)
#4 Implementing `keep_alive` parameter (ibehnam, opened 4 months ago, 1 comment)
#3 'Connection refused' Error (haje01, closed 6 months ago, 2 comments)
#2 404 response from ollama on prompt (nhoffman, closed 7 months ago, 2 comments)
#1 FYI Official Ollama Python Library (easp, closed 7 months ago, 2 comments)