ollama / ollama-python
Ollama Python library
https://ollama.com
MIT License · 2.71k stars · 223 forks
Issues
#53  :pray: Feature request > Create model from file path with SDK (adriens, closed 4 months ago, 1 comment)
#52  create example (mxyng, closed 5 months ago, 0 comments)
#51  keep_alive control how long models stay loaded (stevengans, closed 5 months ago, 1 comment)
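Issue #51 (and PR #31 below) concerns the `keep_alive` parameter, which controls how long a model stays loaded in memory after a request. A minimal sketch, assuming `pip install ollama`, a running local server, and the `llama2` model tag (a placeholder choice); the accepted values follow the Ollama API docs: a duration string such as "5m", 0 to unload immediately, or -1 to keep the model loaded indefinitely.

```python
def keep_alive_value(minutes):
    """Translate minutes into a keep_alive argument.

    0 and -1 are passed through unchanged (unload now / keep forever);
    anything else becomes a duration string like "10m".
    """
    if minutes in (0, -1):
        return minutes
    return f"{minutes}m"

if __name__ == "__main__":
    import ollama  # requires `pip install ollama` and a running `ollama serve`

    ollama.generate(
        model="llama2",  # placeholder model tag
        prompt="Why is the sky blue?",
        keep_alive=keep_alive_value(10),  # stay loaded for 10 minutes
    )
```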
#50  fix: encode base64 inputs (mxyng, closed 5 months ago, 0 comments)
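PR #50 and issue #44 (LLaVA usage, below) both touch multimodal input: the `images` field of a request carries base64-encoded image data. A sketch of the encoding step, assuming a multimodal model tag such as `llava` (a placeholder); the helper itself is plain standard library.

```python
import base64

def image_to_b64(path):
    """Read an image file and return its base64 string, as the API expects."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")

if __name__ == "__main__":
    import ollama  # requires `pip install ollama` and a running server

    resp = ollama.generate(
        model="llava",  # placeholder: any multimodal model tag
        prompt="What is in this picture?",
        images=[image_to_b64("photo.png")],
    )
    print(resp["response"])
```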
#49  Custom models (freQuensy23-coder, closed 5 months ago, 1 comment)
#48  Is there a way for using chat() function by passing more context? (unexpand, closed 5 months ago, 1 comment)
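Issue #48 asks how to pass more context to chat(). The chat endpoint is stateless: context is carried by resending the full messages list on every call. A minimal sketch, assuming a running server and the placeholder `llama2` tag; the helper is a plain list append.

```python
def extend_history(history, role, content):
    """Return a new messages list with one more turn appended.

    chat() is stateless, so the caller resends the whole list each call.
    """
    return history + [{"role": role, "content": content}]

if __name__ == "__main__":
    import ollama  # requires `pip install ollama` and a running server

    history = [{"role": "system", "content": "You are a terse assistant."}]
    history = extend_history(history, "user", "Remember the number 7.")
    reply = ollama.chat(model="llama2", messages=history)
    history = extend_history(history, "assistant", reply["message"]["content"])
    history = extend_history(history, "user", "What number did I ask you to remember?")
    print(ollama.chat(model="llama2", messages=history)["message"]["content"])
```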
#47  Does chat take options? (tmceld, closed 5 months ago, 4 comments)
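Issue #47 asks whether chat() accepts generation options. Both chat() and generate() take an `options` dict whose keys map to the Modelfile parameters (temperature, num_ctx, top_p, and so on, per the Ollama API docs). A sketch with a small helper; the defaults shown are placeholders, not the library's.

```python
def make_options(temperature=0.8, num_ctx=2048):
    """Bundle common sampling options for chat()/generate() in one place."""
    return {"temperature": temperature, "num_ctx": num_ctx}

if __name__ == "__main__":
    import ollama  # requires `pip install ollama` and a running server

    resp = ollama.chat(
        model="llama2",  # placeholder model tag
        messages=[{"role": "user", "content": "Hello!"}],
        options=make_options(temperature=0.2),
    )
    print(resp["message"]["content"])
```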
#46  :memo: Documentation request - for fewshot templates model (adriens, closed 4 months ago, 1 comment)
#45  doc(README): add prerequisites (adriens, closed 1 month ago, 10 comments)
#44  Usage body for LLaVA model (sanujenLLM, closed 5 months ago, 1 comment)
#43  Batching support in Ollama (canamika27, closed 5 months ago, 1 comment)
#42  bump httpx to 0.26.0 (ej52, opened 5 months ago, 0 comments)
#41  Update README.md (rachfop, closed 5 months ago, 1 comment)
#40  fix parse modelfile (mxyng, closed 5 months ago, 0 comments)
#39  function calling (iplayfast, opened 5 months ago, 9 comments)
#38  How can I use `context` parameter? (YRG999, closed 5 months ago, 1 comment)
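Issue #38 concerns the `context` field: generate() returns an opaque token list under `"context"`, and passing it back into the next generate() call continues the conversation without using the chat endpoint. A sketch with a small kwargs builder; the model tag is a placeholder and a running server is assumed.

```python
def followup_kwargs(prev, prompt, model="llama2"):
    """Build kwargs for a follow-up generate() call reusing prev['context']."""
    return {"model": model, "prompt": prompt, "context": prev["context"]}

if __name__ == "__main__":
    import ollama  # requires `pip install ollama` and a running server

    first = ollama.generate(model="llama2", prompt="My name is Ada.")
    # Carry the previous turn's state into the next call.
    follow = ollama.generate(**followup_kwargs(first, "What is my name?"))
    print(follow["response"])
```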
#37  404 on /api/chat (leolivier, closed 5 months ago, 1 comment)
#36  async call in Jupyter notebook (chenxizhang, closed 4 weeks ago, 0 comments)
#35  asyncio.run() cannot be called from a running event loop (chenxizhang, closed 5 months ago, 1 comment)
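Issues #36 and #35 hit the classic "asyncio.run() cannot be called from a running event loop" error inside Jupyter, which already runs its own loop. The library's AsyncClient is simply awaited; in a notebook cell you write `await chat_once(...)` directly, while a plain script wraps it in asyncio.run(). A sketch assuming a running server and the placeholder `llama2` tag.

```python
import asyncio

async def chat_once(prompt):
    """One async chat turn; in Jupyter, call `await chat_once(...)` directly."""
    from ollama import AsyncClient  # requires `pip install ollama` + running server
    resp = await AsyncClient().chat(
        model="llama2",  # placeholder model tag
        messages=[{"role": "user", "content": prompt}],
    )
    return resp["message"]["content"]

def run(prompt):
    """Script entry point only: asyncio.run() fails inside a running loop."""
    return asyncio.run(chat_once(prompt))
```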
#34  error handling code (chenxizhang, closed 5 months ago, 0 comments)
#33  Suggestion: Use models to encapsulate request/responses (sachinsachdeva, opened 5 months ago, 1 comment)
#32  update github actions (mxyng, closed 5 months ago, 0 comments)
#31  add keep_alive (mxyng, closed 5 months ago, 0 comments)
#30  Error: Head "http://127.0.0.1:11434/": read tcp 127.0.0.1:36136->127.0.0.1:11434: read: connection reset by peer (wildcat7534, closed 5 months ago, 1 comment)
#29  README doesn't mention that a running ollama server is required (jmccrosky, opened 5 months ago, 5 comments)
#28  Made the API more readable (Red-exe-Engineer, closed 5 months ago, 2 comments)
#27  ConnectError (morteza-rp, closed 5 months ago, 1 comment)
#26  Add example using seed param to get consistent outputs (mrkiura, opened 5 months ago, 3 comments)
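Issue #26 asks for an example of the seed parameter. Repeatable output comes from pinning both `seed` and `temperature` in the options dict (parameter names per the Ollama API docs). A sketch assuming a running server and the placeholder `llama2` tag.

```python
def reproducible_options(seed=42):
    """Options dict for repeatable sampling: fixed seed, zero temperature."""
    return {"seed": seed, "temperature": 0}

if __name__ == "__main__":
    import ollama  # requires `pip install ollama` and a running server

    a = ollama.generate(model="llama2", prompt="Tell me a joke.",
                        options=reproducible_options())
    b = ollama.generate(model="llama2", prompt="Tell me a joke.",
                        options=reproducible_options())
    # With the same seed and temperature 0, both responses should match.
    print(a["response"] == b["response"])
```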
#25  Batching (varunshenoy, closed 5 months ago, 3 comments)
#24  Clarification: Does this install ollama or just act as an API to an existing install? (ganakee, closed 5 months ago, 2 comments)
#23  Is there a command that just serves the REST API, similar to ollama run on the CLI? (gregnwosu, closed 5 months ago, 2 comments)
#22  GUI using Gradio or Pygame or Streamlit (MostlyKIGuess, closed 5 months ago, 2 comments)
#21  What is the difference between ollama.generate and ollama.chat? (haomes, closed 5 months ago, 2 comments)
#20  fix pull example typo (BruceMacD, closed 5 months ago, 0 comments)
#19  upgrade pillow (mxyng, closed 5 months ago, 0 comments)
#18  fix unit tests (mxyng, closed 5 months ago, 0 comments)
#17  update response error field to match json response (mxyng, closed 5 months ago, 0 comments)
#16  fix: update OLLAMA_HOST parsing to match ollama CLI (mxyng, closed 5 months ago, 0 comments)
#15  examples (mxyng, closed 5 months ago, 0 comments)
#14  Server error '502 Bad Gateway' for url 'http://127.0.0.1:11434/api/chat' (RussellXY, closed 5 months ago, 2 comments)
#13  add example for pulling with progress bar (mxyng, closed 5 months ago, 0 comments)
#12  httpx.ConnectError: [Errno 111] Connection refused (asmith26, closed 5 months ago, 3 comments)
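Issues #12, #27, and #30 all report connection failures, and issue #29 names the root cause: the library is only a client, so `ollama serve` must already be running on localhost:11434. The client is built on httpx, so a refused connection surfaces as httpx.ConnectError (as in the #12 title); a guarded call turns the stack trace into a readable hint. The hint text and model tag are placeholders.

```python
CONNECT_HINT = "error: is `ollama serve` running on localhost:11434?"

def safe_chat(prompt):
    """Call chat() but convert a refused connection into a readable hint."""
    import httpx   # the client's HTTP transport
    import ollama  # requires `pip install ollama`
    try:
        resp = ollama.chat(model="llama2",  # placeholder model tag
                           messages=[{"role": "user", "content": prompt}])
        return resp["message"]["content"]
    except httpx.ConnectError:
        return CONNECT_HINT
```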
#11  ci: give publish job content perms (mxyng, closed 5 months ago, 0 comments)
#10  Changed 'show' to use POST instead of GET (easp, closed 5 months ago, 0 comments)
#9   s/target/destination/ (mxyng, closed 5 months ago, 0 comments)
#8   fix: remote create new file (BruceMacD, closed 5 months ago, 0 comments)
#7   fix gh release upload (mxyng, closed 5 months ago, 0 comments)
#6   fix publish (mxyng, closed 5 months ago, 0 comments)
#5   Mxyng (mxyng, closed 5 months ago, 0 comments)
#4   Mxyng/rm default async (mxyng, closed 5 months ago, 0 comments)