Closed: ishaan-jaff closed this issue 7 months ago
@ishaan-jaff Hi, thanks for your great suggestion. However, we currently have a very tight schedule; we would be glad to review a pull request if you can contribute this to the project. Thanks again for your interest!
Is your feature request related to a problem? Please describe.
Easily and quickly evaluate 100+ LLMs.
Hi @Longin-Yu @HenryCai11, I'm the maintainer of LiteLLM. LiteLLM lets you create a proxy server that calls 100+ LLMs, which makes it easier to run benchmarks/evals.
I'm opening this issue because I believe LiteLLM makes it easier for you to run benchmarks and evaluate LLMs (I'd love your feedback if it doesn't).
Try it here: https://docs.litellm.ai/docs/simple_proxy https://github.com/BerriAI/litellm
Using LiteLLM Proxy Server
Creating a proxy server
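Once a proxy is running, it exposes an OpenAI-compatible API, so any OpenAI-style client can talk to it. Below is a minimal sketch of building such a request with only the standard library; the host/port (`http://0.0.0.0:8000`) and the `build_chat_request` helper are assumptions for illustration, not part of LiteLLM itself.

```python
import json
import urllib.request

# Assumed local endpoint for a running LiteLLM proxy; adjust host/port
# to wherever your proxy is actually listening.
PROXY_URL = "http://0.0.0.0:8000/chat/completions"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at the proxy."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urllib.request.Request(
        PROXY_URL,
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("gpt-3.5-turbo", "Say hello")
# urllib.request.urlopen(req) would send it once the proxy is up.
```

Because the proxy speaks the OpenAI wire format, swapping the underlying provider is just a matter of changing the `model` string.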
Ollama models
Hugging Face Models
Anthropic
Palm
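For the providers listed above, LiteLLM routes requests based on a provider prefix in the model name. The specific model identifiers below are illustrative examples of that naming convention, not an exhaustive list:

```python
# Example model identifiers per provider, following LiteLLM's
# provider-prefix convention; the concrete model names are examples only.
EXAMPLE_MODELS = {
    "Ollama": "ollama/llama2",
    "Hugging Face": "huggingface/bigcode/starcoder",
    "Anthropic": "claude-2",
    "Palm": "palm/chat-bison",
}

for provider, model in EXAMPLE_MODELS.items():
    print(f"{provider}: {model}")
```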
Using LiteLLM to run an eval with lm-evaluation-harness:
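Since the harness's OpenAI backend reads its endpoint from the environment, one way to route it through a local proxy is to override the base URL before launching the eval. This is a sketch under assumptions: the proxy address (`http://0.0.0.0:8000`) and the dummy key are placeholders, and the exact environment variable the harness honors may differ by version.

```python
import os

# Point any OpenAI-compatible client (including lm-evaluation-harness's
# OpenAI backend) at the local LiteLLM proxy instead of api.openai.com.
# Host/port and the dummy key value are assumptions for illustration.
os.environ["OPENAI_API_BASE"] = "http://0.0.0.0:8000"
os.environ["OPENAI_API_KEY"] = "anything"  # a local proxy can ignore the key

# With these set, the harness CLI (run separately in the same shell
# session) would send its OpenAI API calls through the proxy.
print(os.environ["OPENAI_API_BASE"])
```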