AgentOps-AI / agentops

Python SDK for agent monitoring, LLM cost tracking, benchmarking, and more. Integrates with most LLMs and agent frameworks like CrewAI, Langchain, and Autogen.
https://agentops.ai
MIT License

Ollama Support #237

Closed sprajosh closed 2 weeks ago

sprajosh commented 4 weeks ago

📥 Pull Request

📘 Description Add Ollama support by patching the ollama.chat function.
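The patching approach described above can be sketched generically. This is a minimal illustration of the wrap-and-record pattern, not the actual AgentOps implementation; the `record_event` callback and the stand-in `fake_chat` function are hypothetical names used only for this example.

```python
import functools

def track_chat(original_chat, record_event):
    """Wrap a chat function so each call is recorded after it returns."""
    @functools.wraps(original_chat)
    def wrapper(*args, **kwargs):
        response = original_chat(*args, **kwargs)
        # Record the call parameters and response as one event.
        record_event({"args": args, "kwargs": kwargs, "response": response})
        return response
    return wrapper

# Illustration with a stand-in chat function (no real ollama call):
events = []

def fake_chat(model, messages):
    return {"message": {"role": "assistant", "content": "ok"}}

patched = track_chat(fake_chat, events.append)
patched(model="orca-mini", messages=[{"role": "user", "content": "hi"}])
```

In the real patch, `wrapper` would replace `ollama.chat` at import time, so existing user code keeps calling `ollama.chat` unchanged while every call is reported to the session.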

🔄 Related Issue (if applicable) Ollama support #192

🎯 Goal Add support for the official Ollama Python library.

๐Ÿ” Additional Context Any extra information or context to help us understand the change?

🧪 Testing

import ollama
import agentops

AGENTOPS_API_KEY = "<api-key>"
agentops.init(AGENTOPS_API_KEY)

# Sync chat
response = ollama.chat(
    model="orca-mini",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
)

# Streaming chat
stream = ollama.chat(
    model="orca-mini",
    messages=[{"role": "user", "content": "Why is the sky blue?"}],
    stream=True,
)

for chunk in stream:
    print(chunk)

agentops.end_session("Success")

This is a first draft. I'd like some feedback to understand if I'm missing something. ~Also, I don't see the analytics on session drill-down view. I will have to check the frontend project as well to see if this is happening because ollama is an unknown event.~

Todo

Dependencies ~https://github.com/AgentOps-AI/tokencost/pull/49 - Ollama support in tokencost to count tokens from messages~ Token cost is calculated on the server.

siyangqiu commented 4 weeks ago

It looks like ollama has a streaming mode. I don't know if you want to add support for that

sprajosh commented 4 weeks ago

It looks like ollama has a streaming mode. I don't know if you want to add support for that

Yes, I have added support for ollama.chat, ollama.chat with stream, ollama.Client.chat, and ollama.AsyncClient.chat.
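For the streaming case, a patch cannot record a single response object up front; it has to wrap the generator, pass chunks through to the caller, and record the accumulated completion once the stream ends. A minimal sketch of that pattern for the async case, using a hypothetical `record_event` callback and a stand-in stream instead of a real ollama.AsyncClient:

```python
import asyncio

async def track_stream(stream, record_event):
    """Pass streamed chunks through, then record the full completion once."""
    parts = []
    async for chunk in stream:
        parts.append(chunk["message"]["content"])
        yield chunk
    record_event("".join(parts))

# Stand-in async stream mimicking ollama's chunk shape:
async def fake_stream():
    for piece in ["Hello", ", ", "world"]:
        yield {"message": {"content": piece}}

async def main():
    events = []
    async for chunk in track_stream(fake_stream(), events.append):
        pass  # the caller consumes chunks as usual
    return events

events = asyncio.run(main())  # events == ["Hello, world"]
```

The same wrapping works for the sync streaming path with a plain generator and a `for` loop.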

siyangqiu commented 3 weeks ago

Awesome! Thanks for making the changes. Sorry I wasn't clear enough about token cost! I'll test this again if you can remove tokencost

gitguardian[bot] commented 3 weeks ago

๏ธโœ… There are no secrets present in this pull request anymore.

If these secrets were true positives and are still valid, we highly recommend that you revoke them. Once a secret has been leaked into a git repository, you should consider it compromised, even if it was deleted immediately. Find more information about risks here.


🦉 GitGuardian detects secrets in your source code to help developers and security teams secure the modern development process. You are seeing this because you or someone else with access to this repository has authorized GitGuardian to scan your pull request.

siyangqiu commented 3 weeks ago

I just tested this and it works! Good work! 🎉

I noticed that prompt tokens weren't being counted, but I suspect that's something I need to fix on the API server.