comet-ml / opik

Open-source end-to-end LLM Development Platform
Apache License 2.0

[FR]: Support for Local Models #272

Closed. Mr-Moonsilver closed this issue 1 week ago

Mr-Moonsilver commented 1 week ago

Willingness to contribute

No. I can't contribute this feature at this time.

Proposal summary

Provide support for local models via Ollama or LM Studio.

Motivation

Evaluating the performance of locally deployed models could add a lot of value. Being able to evaluate a fine-tuned model, or a model running inside a custom workflow, would make this tool super valuable for local prototyping and testing.

jverre commented 1 week ago

Hey @Mr-Moonsilver

This is something we are actively looking into. How are you using Ollama today? Are you using it from the command line or through the Python SDK, for example?

Mr-Moonsilver commented 1 week ago

@jverre that is really exciting indeed! I'm using it almost exclusively through the Python SDK. I am hosting models locally, some of them fine-tuned, and using the Ollama API to interact with these models.
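
For context, my current setup looks roughly like this (the model name is just a placeholder for one of my local fine-tunes):

```python
# Rough sketch of the workflow described above: calling a locally hosted
# (possibly fine-tuned) model through the Ollama Python SDK.
import ollama

response = ollama.chat(
    model="my-finetune",  # placeholder for a locally pulled / fine-tuned model
    messages=[{"role": "user", "content": "Summarize the release notes."}],
)
print(response["message"]["content"])
```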

jverre commented 1 week ago

Thanks @Mr-Moonsilver, let me take a look and get back to you

jverre commented 1 week ago

Hey @Mr-Moonsilver, I took a look and found 3 different ways to integrate Ollama with Opik! I've created a new documentation page with more information about these here: https://www.comet.com/docs/opik/tracing/integrations/ollama

Let me know what you think
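
As a rough example, one of the approaches looks something like this (the endpoint and model name are assumptions for the sketch, the docs page has the exact details for each option):

```python
# Sketch of one integration route: pointing the OpenAI client at Ollama's
# OpenAI-compatible endpoint and wrapping it with Opik's track_openai so
# the calls are logged as traces in Opik.
from openai import OpenAI
from opik.integrations.openai import track_openai

client = track_openai(
    OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible API
        api_key="ollama",                      # placeholder; Ollama ignores the key
    )
)

completion = client.chat.completions.create(
    model="llama3",  # any model you have pulled locally with Ollama
    messages=[{"role": "user", "content": "Hello from a local model!"}],
)
print(completion.choices[0].message.content)
```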

Mr-Moonsilver commented 1 week ago

I think this is amazing. Thank you very much!