Closed — Rossi1337 closed this 2 weeks ago
I'll do some research on that. I don't think LangChain allows for anything similar, as it goes beyond the scope of the library. I'll let you know.
@Rossi1337 I think the only solutions would be either to use LangSmith (hard limit of 5,000 traces on the free tier) or to calculate our own statistics in the app. If we choose LangSmith, we would need to specify its own API token for tracing. Let me know which solution you would prefer.
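The "calculate our own statistics in app" option could be as simple as wrapping each model call with a timer. A minimal sketch, assuming a generic text-generation callable (`timed_call` and the whitespace-based token estimate are hypothetical, not part of any library):

```python
import time

def timed_call(generate, prompt):
    """Wrap any text-generation callable and attach rough throughput stats."""
    start = time.perf_counter()
    text = generate(prompt)
    elapsed = time.perf_counter() - start
    # Crude whitespace estimate; real token counts would come from the backend.
    tokens = len(text.split())
    stats = {
        "seconds": elapsed,
        "approx_tokens": tokens,
        "approx_tokens_per_s": tokens / elapsed if elapsed > 0 else 0.0,
    }
    return text, stats

# Usage with a stand-in model function:
text, stats = timed_call(lambda p: "hello from the model", "hi")
```

This keeps the app independent of LangSmith, at the cost of only approximate token counts.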
I'm interested in getting some performance statistics when I download a new model and test it. In Ollama you can do this with the following command:
ollama run llama3 --verbose
That prints some additional statistics after each response. It would be nice to have something like this available in the UI as well. Add an option under Settings -> Ollama -> Show Statistics; when activated, pass this on to Ollama. I'm not sure whether this is supported via LangChain and the API, but I hope it is. Additionally, we could write our own markup handler for that info to render it differently or make it collapsible.
Maybe that info is already present in the API response and we just need to find a way to add it to the UI in a non-intrusive way. Then we could skip the settings toggle and have it enabled by default.
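For what it's worth, Ollama's `/api/generate` endpoint does return these statistics in the final response object (fields such as `eval_count` and `eval_duration`, with durations in nanoseconds), which is what `--verbose` prints. A minimal sketch of deriving the same numbers; the sample payload values below are made up, but the field names match Ollama's documented response:

```python
def summarize_stats(resp: dict) -> dict:
    """Derive --verbose-style statistics from a final /api/generate response."""
    ns = 1e9  # Ollama reports durations in nanoseconds
    eval_count = resp.get("eval_count", 0)
    eval_duration = resp.get("eval_duration", 0)
    return {
        "total_s": resp.get("total_duration", 0) / ns,
        "load_s": resp.get("load_duration", 0) / ns,
        "prompt_tokens": resp.get("prompt_eval_count", 0),
        "response_tokens": eval_count,
        "tokens_per_s": eval_count / (eval_duration / ns) if eval_duration else 0.0,
    }

# Hypothetical values in the shape Ollama returns:
sample = {
    "total_duration": 5_000_000_000,
    "load_duration": 500_000_000,
    "prompt_eval_count": 26,
    "eval_count": 100,
    "eval_duration": 4_000_000_000,
}
stats = summarize_stats(sample)
print(f"{stats['response_tokens']} tokens in {stats['total_s']:.1f}s "
      f"({stats['tokens_per_s']:.1f} tok/s)")
```

So if the LangChain integration exposes the raw response metadata, the UI could show this without any extra setting.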