AgentOps-AI / agentops

Python SDK for agent monitoring, LLM cost tracking, benchmarking, and more. Integrates with most LLMs and agent frameworks like CrewAI, Langchain, and Autogen.
https://agentops.ai
MIT License

Open Source LLMs using ollama #265

Closed Shuaib11-Github closed 1 week ago

Shuaib11-Github commented 1 week ago

🐛 Bug Report

🔎 Describe the Bug Give a clear and concise description of the bug.

I tried to run an open-source model using Ollama, but the LLM calls are not being posted to the AgentOps dashboard. Can you fix this for any open-source LLM used either through Ollama, Groq, or some other way?
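For reference, a minimal script along these lines matches the setup described above (a sketch only, since the exact code was not shared; it assumes the `ollama` Python client, a locally pulled model such as `llama3`, and an AgentOps API key):

```python
import agentops
import ollama

# Start an AgentOps session; LLM calls made afterwards should show up on the dashboard.
agentops.init(api_key="<AGENTOPS_API_KEY>")

# Call a local open-source model served by Ollama (the model name is an example).
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(response["message"]["content"])

# Close the session; without Ollama instrumentation, no LLM events are recorded above.
agentops.end_session("Success")
```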

🔄 Reproduction Steps List the steps to reproduce the behavior.

🙁 Expected Behavior Describe what you expected to happen.

Integrate open-source LLMs as well, so that they post data about what's happening under the hood.

📸 Screenshots If applicable, add screenshots to help explain the problem.

🔍 Additional Context Provide any other context about the problem here.

Thank you for helping us improve AgentOps!

sprajosh commented 1 week ago

Hey, Ollama support was added recently. You can use it by pulling the GitHub repo and installing the package manually to get the latest changes.

To install the latest version of agentops, you can follow these steps:

  1. Clone the project: `git clone git@github.com:AgentOps-AI/agentops.git`
  2. `cd agentops`
  3. `pip install -e .`
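
After installing from source, a quick sanity check (a sketch; it assumes nothing beyond Python 3.8+ for `importlib.metadata`) is to confirm that the imported package is the local checkout rather than the older PyPI release:

```python
import importlib.metadata

import agentops

# The reported version and path should correspond to the cloned main branch,
# not the last release published on PyPI.
print(importlib.metadata.version("agentops"))
print(agentops.__file__)  # should point into the local agentops/ checkout
```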
albertkimjunior commented 1 week ago

Hey @sprajosh, I'm also currently reviewing this bug. I'll let you know if there's anything I find.

sprajosh commented 1 week ago

@albertkimjunior I think you misunderstood what I said. I have already added Ollama support. Since the released version has not been updated yet, I was suggesting pulling the code from GitHub and installing the latest package manually.

Or is there a bug in the Ollama models?

albertkimjunior commented 1 week ago

@sprajosh No I understood. I am also looking at the Ollama support pull request that @siyangqiu approved to double-check if there are any possible issues.

siyangqiu commented 1 week ago

Ollama support was merged to the main branch, but it has not been released to PyPI yet. There is an unrelated bug blocking the release 🙁. We'll have it released shortly.

siyangqiu commented 1 week ago

Ollama support is released in v0.2.5.
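
With v0.2.5 on PyPI, a regular `pip install --upgrade agentops` should pull in the Ollama instrumentation, so the editable install from the cloned repo above should no longer be necessary.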