ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

fail to run this example file [examples/multimodal/main.py] #73

Open · dabing1205 opened this issue 4 months ago

dabing1205 commented 4 months ago

Hello, I failed to run this example after installing ollama and the llava model.

I've pasted the run log below. Please take a look at this issue and, if possible, update the example demo in this repo.

> {'month': '2', 'num': 2739, 'link': '', 'year': '2023', 'news': '', 'safe_title': 'Data Quality', 'transcript': '', 'alt': "[exclamation about how cute your cat is] -> [last 4 digits of your cat's chip ID] -> [your cat's full chip ID] -> [a drawing of your cat] -> [photo of your cat] -> [clone of your cat] -> [your actual cat] -> [my better cat]", 'img': 'https://imgs.xkcd.com/comics/data_quality.png', 'title': 'Data Quality', 'day': '17'}
> xkcd #2739: [exclamation about how cute your cat is] -> [last 4 digits of your cat's chip ID] -> [your cat's full chip ID] -> [a drawing of your cat] -> [photo of your cat] -> [clone of your cat] -> [your actual cat] -> [my better cat]
> link: https://xkcd.com/2739
> ---
> https://imgs.xkcd.com/comics/data_quality.png
> Traceback (most recent call last):
>   File "/Users/x/Workspace/test_ollama/ollama-python/examples/multimodal/main.py", line 27, in <module>
>     for response in generate('llava', 'explain this comic:', images=[raw.content], stream=True):
>   File "/Users/x/Workspace/test_ollama/lib/python3.9/site-packages/ollama/_client.py", line 68, in _stream
>     raise ResponseError(e.response.text, e.response.status_code) from None
> ollama._types.ResponseError
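
The traceback ends in a bare `ollama._types.ResponseError`, which hides the actual message the server returned. A minimal sketch for narrowing this down, assuming `ResponseError` exposes `error` and `status_code` attributes (as it is raised in `_client.py` above) — `encode_image` and `explain_comic` are hypothetical helper names, not part of the example:

```python
import base64

def encode_image(raw_bytes: bytes) -> str:
    # The Ollama server expects images as base64 strings; ollama-python
    # normally encodes raw bytes for you, but encoding explicitly rules
    # this step out as the failure point.
    return base64.b64encode(raw_bytes).decode("ascii")

def explain_comic(image_bytes: bytes) -> None:
    # Imported inside the function so the helper above runs even
    # without the ollama package installed.
    from ollama import generate, ResponseError
    try:
        for response in generate('llava', 'explain this comic:',
                                 images=[encode_image(image_bytes)],
                                 stream=True):
            print(response['response'], end='', flush=True)
    except ResponseError as e:
        # Print the server's error body instead of a bare traceback;
        # a 404 here usually means the model has not been pulled yet.
        print(f'server returned {e.status_code}: {e.error}')
```

If the printed status is 404, `ollama pull llava` on the server side is the likely fix; a connection error instead would point at the `ollama serve` process not running.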
mxyng commented 4 months ago

Can you describe step by step what you're doing? I'm not able to reproduce this with the latest example, ollama-python, and ollama.