Shuaib11-Github closed this issue 1 week ago
Hey, Ollama support was added recently. You can use it by pulling the GitHub repo and installing it manually to get the latest changes.
To install the latest version of agentops, you can follow these steps:

```shell
git clone git@github.com:AgentOps-AI/agentops.git
cd agentops
pip install -e .
```
Hey @sprajosh, I'm also currently reviewing this bug; I'll let you know if there's anything I find.
@albertkimjunior I think you misunderstood what I said. I have already added Ollama support. Since the released version has not been updated, I was suggesting pulling the code from GitHub and installing the latest package manually.
Or, is there a bug in the Ollama models?
@sprajosh No, I understood. I am also looking at the Ollama support pull request that @siyangqiu approved, to double-check whether there are any possible issues.
Ollama support was merged to the main branch, but it has not yet been released to PyPI. There is an unrelated bug blocking the release ⚠️. We'll have it released shortly.
Ollama support was released in v0.2.5.
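Since Ollama support only ships from v0.2.5 onward, a quick way to sanity-check a locally installed version string is a simple tuple comparison. This is a minimal sketch; the `has_ollama_support` helper is ours for illustration, not part of the agentops API:

```python
# Sketch: check whether an agentops version string is at least v0.2.5,
# the first release that includes Ollama support.
def has_ollama_support(version: str) -> bool:
    """Return True if `version` (e.g. "0.2.5") is >= 0.2.5."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return parts >= (0, 2, 5)

print(has_ollama_support("0.2.4"))  # False: predates the Ollama release
print(has_ollama_support("0.2.5"))  # True
```

If the check fails, the editable install from the repo (above) is the way to pick up the change before the PyPI release.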
Bug Report

Describe the Bug
I tried to run an open source model using Ollama, but the LLM calls are not posted to the AgentOps dashboard. Can you fix this for any open source LLM used through Ollama, Groq, or some other provider?
Reproduction Steps
Expected Behavior
Open source LLMs should also be instrumented so that data about what is happening under the hood is posted to the dashboard.
Thank you for helping us improve AgentOps!