stephenturner / biorecap

Retrieve and summarize bioRxiv preprints with a local LLM using ollama
https://stephenturner.github.io/biorecap/

Ollama local server not running or wrong server #3

Closed Michael-Geuenich closed 2 months ago

Michael-Geuenich commented 2 months ago

Hi,

Thank you for building this package; I'm curious to try it out. Unfortunately, I can't get it running. I've:

  1. installed biorecap and can load it with library(biorecap) without issue
  2. I've downloaded and installed ollama, which seems to run as a command line tool:
    ollama run llama3.1
    >>> test
    It looks like you're just testing the chat. How can I assist you today?
  3. However, when I run the test_connection() function in RStudio I get the following error:
    > test_connection()
    Ollama local server not running or wrong server.
    Download and launch Ollama app to run the server. Visit https://ollama.com or https://github.com/ollama/ollama
    <httr2_request>
    GET http://localhost:11434
    Body: empty
    Warning messages:
    1: In curl_system_version() : restarting interrupted promise evaluation
    2: In curl_system_version() : internal error -3 in R_decompress1

Thank you!
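For reference, the request that test_connection() makes can be reproduced directly with httr2 (the HTTP client ollamar uses), which may help separate an R-side problem from a server-side one. This is a minimal sketch; the URL is Ollama's default local endpoint:

```r
# Reproduce the GET http://localhost:11434 request from the error message,
# using httr2 directly rather than going through ollamar.
library(httr2)

resp <- tryCatch(
  request("http://localhost:11434") |> req_perform(),
  error = function(e) e
)

if (inherits(resp, "error")) {
  # The server is not reachable from R, or the request itself failed.
  message("Could not reach the Ollama server: ", conditionMessage(resp))
} else {
  # A healthy Ollama server answers 200 with the body "Ollama is running".
  message("Server responded with status ", resp_status(resp))
}
```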

stephenturner commented 2 months ago

Hm... that's definitely an issue with the ollamar package, or with how your R session is talking to ollama. I'd suggest searching the issues there: https://github.com/hauselin/ollama-r/issues?q=is%3Aissue+is%3Aclosed. The docs don't offer much additional advice for this error: https://hauselin.github.io/ollama-r/#notes
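One other thing worth checking: the `internal error -3 in R_decompress1` warnings in your output usually point to a corrupted package installation (a damaged lazy-load database) rather than to ollama itself. A hedged suggestion, not a confirmed diagnosis for this case: reinstall the packages involved in the HTTP call and restart R before retrying.

```r
# "internal error -3 in R_decompress1" commonly indicates a corrupted
# package install. Reinstalling curl and httr2 (the packages named in the
# warnings and used for the request) often clears it.
install.packages(c("curl", "httr2"))

# Then restart R and retry:
# library(biorecap)
# test_connection()
```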