Open hugokoopmans opened 5 months ago
Hi,
Thank you for your efforts here, great work.
Any chance you could enhance the code so it could also run against an Ollama model running in a Docker container exposing localhost:11434?
Thx
hugo
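In the meantime, here is a minimal sketch (not part of this repo) of how one could point the ollama Python client at a Dockerized instance on localhost:11434; the model name `llama3` is just a placeholder for whatever you have pulled:

```python
from ollama import Client

# Point the client at the Ollama server running inside Docker;
# the container must publish port 11434 (e.g. `docker run -p 11434:11434 ollama/ollama`).
client = Client(host="http://localhost:11434")

# Placeholder model name; substitute whichever model the container has pulled.
response = client.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response["message"]["content"])
```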
@hugokoopmans The maintainer here seems absent. I just added a remote-instance feature in my fork: https://github.com/willybcode/llm-benchmark. You just pass --remote [HOST].
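For anyone wiring this up themselves, here is a rough sketch of how such a --remote flag could be plumbed through to the client; this is a hypothetical example, and the fork's actual flag handling may differ:

```python
import argparse
from ollama import Client

# Hypothetical CLI wiring; the fork's actual implementation may look different.
parser = argparse.ArgumentParser(description="Benchmark against a remote Ollama instance")
parser.add_argument(
    "--remote",
    default="http://localhost:11434",
    help="Base URL of the Ollama server (defaults to a locally Docker-exposed instance)",
)
args = parser.parse_args()

client = Client(host=args.remote)
# Quick connectivity check: ask the server which models it has pulled.
print(client.list())
```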