aidatatools / ollama-benchmark

LLM Benchmark for Throughput via Ollama (Local LLMs)
https://llm.aidatatools.com/
MIT License

Need to report tokens/sec #1

Closed dbabokin closed 5 months ago

dbabokin commented 5 months ago

With temperature not set to 0, the result may (and will) vary from run to run. A better way to report performance is tokens/sec.
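As a sketch of what this metric looks like: Ollama's non-streaming `/api/generate` response includes `eval_count` (tokens generated) and `eval_duration` (generation time in nanoseconds), from which tokens/sec can be derived. The sample numbers below are illustrative, not real benchmark results.

```python
def tokens_per_second(response: dict) -> float:
    """Generation throughput: tokens emitted divided by generation time in seconds.

    eval_duration is reported in nanoseconds, so multiply by 1e9 to
    convert the ratio to tokens per second.
    """
    return response["eval_count"] / response["eval_duration"] * 1e9


# Illustrative values: 100 tokens generated in 2 seconds (2e9 ns).
sample = {"eval_count": 100, "eval_duration": 2_000_000_000}
print(tokens_per_second(sample))  # -> 50.0
```

Because this ratio is computed from counts and durations rather than from the generated text itself, it stays comparable across runs even when a nonzero temperature makes the output differ.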

dbabokin commented 5 months ago

Ah, I checked test_llm.py, didn't see tokens/s there, and reported this. Now I see that query_llm.py reports this metric. So, not a problem then.