MinhNgyuen / llm-benchmark

Benchmark LLM performance
MIT License

adding functionality to run the benchmark against an Ollama instance running in a Docker container #3

Open hugokoopmans opened 5 months ago

hugokoopmans commented 5 months ago

Hi,

Thank you for your efforts here, great work.

Any chance you could enhance the code so it could also run against an Ollama model running in a Docker container exposing localhost:11434?

Thx

hugo
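For reference, a container started with Ollama's published image (`docker run -d -p 11434:11434 ollama/ollama`) serves the same HTTP API as a native install, so benchmarking it only requires pointing requests at the mapped host. Below is a minimal sketch of that idea, assuming a single non-streaming call to Ollama's documented `/api/generate` endpoint and its `eval_count`/`eval_duration` (nanoseconds) response fields; the function names are illustrative, not from this repository.

```python
import json
import urllib.request

# Assumption: the Docker container maps port 11434 to the host.
DEFAULT_HOST = "http://localhost:11434"

def generate_url(host: str) -> str:
    """Build the Ollama /api/generate endpoint from a base host URL."""
    return host.rstrip("/") + "/api/generate"

def tokens_per_second(eval_count: int, eval_duration_ns: int) -> float:
    """Throughput from Ollama's final-response fields (duration is in nanoseconds)."""
    return eval_count / (eval_duration_ns / 1e9)

def benchmark_once(host: str, model: str, prompt: str) -> float:
    """Send one non-streaming generate request and return tokens/sec."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        generate_url(host),
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return tokens_per_second(body["eval_count"], body["eval_duration"])
```

Because the host is a parameter, the same code works against a local install, a Docker container, or a remote machine.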

willybcode commented 1 month ago

@hugokoopmans The maintainer here seems absent. I just added the remote-instance feature here: https://github.com/willybcode/llm-benchmark. You just use --remote [HOST].
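The fork's exact implementation isn't shown in this thread, but a flag like `--remote` typically amounts to an argparse option that overrides the default local endpoint. A minimal sketch of that pattern, with the default and option name assumed for illustration:

```python
import argparse

def parse_args(argv=None):
    """Sketch of a --remote flag: fall back to the local Ollama port
    unless a remote host URL is supplied on the command line."""
    parser = argparse.ArgumentParser(description="LLM benchmark")
    parser.add_argument(
        "--remote",
        default="http://localhost:11434",  # assumption: local Ollama default
        help="base URL of the Ollama instance to benchmark",
    )
    return parser.parse_args(argv)
```

Every request in the benchmark would then be built from `args.remote` instead of a hard-coded localhost URL.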