fixie-ai / thefastest.ai

Website with current metrics on the fastest AI models.

How can my inference platform join the benchmark? #34

nickwind opened this issue 1 month ago

nickwind commented 1 month ago

Hello, I have set up an inference platform with more than 100 GPUs that provides inference services for popular LLMs. I would like to join this benchmark; how can I do that?

juberti commented 1 month ago

You can open a PR like this one for DeepInfra: https://github.com/fixie-ai/ai-benchmarks/pull/82
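
For context, a minimal sketch of the kind of measurement such a provider entry feeds into is shown below. It is illustrative only and does not mirror the actual fixie-ai/ai-benchmarks code from that PR; it assumes the new platform exposes an OpenAI-compatible streaming chat-completions endpoint, and the `PROVIDER_BASE_URL`, `PROVIDER_API_KEY`, and model id placeholders are hypothetical.

```python
# Illustrative sketch only -- not the actual fixie-ai/ai-benchmarks harness.
# Assumes an OpenAI-compatible streaming chat-completions endpoint;
# the base URL, API key, and model id below are placeholders.
import os
import time

from openai import OpenAI  # pip install openai

BASE_URL = os.environ.get("PROVIDER_BASE_URL", "https://api.example.com/v1")
API_KEY = os.environ.get("PROVIDER_API_KEY", "sk-placeholder")
MODEL = "example/llama-3-8b-instruct"  # hypothetical model id

client = OpenAI(base_url=BASE_URL, api_key=API_KEY)

start = time.perf_counter()
first_token_at = None
num_chunks = 0

# Stream a short completion and note when the first token arrives.
stream = client.chat.completions.create(
    model=MODEL,
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
    max_tokens=64,
    stream=True,
)
for chunk in stream:
    delta = chunk.choices[0].delta.content if chunk.choices else None
    if delta:
        if first_token_at is None:
            first_token_at = time.perf_counter()
        num_chunks += 1
elapsed = time.perf_counter() - start

# Time to first token (TTFT) and total streaming time are the kinds of
# per-provider numbers thefastest.ai reports.
if first_token_at is not None:
    print(f"TTFT: {(first_token_at - start) * 1000:.0f} ms")
print(f"Total: {elapsed * 1000:.0f} ms for {num_chunks} streamed chunks")
```

In practice the PR itself would add the provider's endpoint and model details to the benchmark repo so the harness can run measurements like this against it; see the linked DeepInfra PR for the authoritative example.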