chandeldivyam / samwise

An executive assistant that sits on your computer and helps with everyday tasks
MIT License

Issue while generating summary #11

Open · dhruv-shipsy opened this issue 2 months ago

dhruv-shipsy commented 2 months ago

Failed to send request to Ollama API: error sending request for url (http://localhost:11434/v1/chat/completions): error trying to connect: tcp connect error: Connection refused (os error 61)

chandeldivyam commented 2 months ago

Hi @dhruv-shipsy, thank you so much for reporting this.

In the Settings section, if Ollama is running on a different port, you can point samwise at that port.

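If it helps, here is a minimal sketch (Python, standard library only) for checking that Ollama is actually listening on the expected port before retrying summary generation. The URL is the same one from the error above; the model name `llama3` is just an assumption, so substitute whichever model you have pulled locally.

```python
# Sketch: probe the Ollama OpenAI-compatible chat endpoint to confirm it is reachable.
import json
import urllib.error
import urllib.request

OLLAMA_URL = "http://localhost:11434/v1/chat/completions"  # same URL as in the error message

payload = {
    "model": "llama3",  # assumed model name; use one you have pulled locally
    "messages": [{"role": "user", "content": "ping"}],
}

req = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print("Ollama reachable, status:", resp.status)
except urllib.error.URLError as err:
    # "Connection refused" here means nothing is listening on this port:
    # start Ollama (e.g. `ollama serve`) or set the correct port in Settings.
    print("Could not reach Ollama:", err)
```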

Alternatively, if you want to use an LLM that is not local, you can change the strategy to Gemini and enter your API key.

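And here is a similar sketch for sanity-checking a Gemini API key before entering it in samwise. It assumes Google's public Generative Language API and the `gemini-1.5-flash` model; the `GEMINI_API_KEY` environment variable is just a placeholder for however you store the key.

```python
# Sketch: send a tiny request to the Gemini API to verify the key works.
import json
import os
import urllib.error
import urllib.request

API_KEY = os.environ["GEMINI_API_KEY"]  # hypothetical env var holding your key
URL = (
    "https://generativelanguage.googleapis.com/v1beta/models/"
    f"gemini-1.5-flash:generateContent?key={API_KEY}"
)

payload = {"contents": [{"parts": [{"text": "ping"}]}]}

req = urllib.request.Request(
    URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

try:
    with urllib.request.urlopen(req, timeout=15) as resp:
        print("Gemini key accepted, status:", resp.status)
except urllib.error.HTTPError as err:
    # 400/403 responses usually indicate an invalid or restricted API key.
    print("Gemini rejected the request:", err.code, err.reason)
except urllib.error.URLError as err:
    print("Could not reach the Gemini API:", err)
```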