Open Anchang8 opened 4 days ago
I want to use a local model downloaded from Hugging Face as the provider of TruLens feedback for my RAG evaluation project. If this feature is already available, how can I use it? Thank you.

Hi @Anchang8 - we don't currently support this for models running locally from Hugging Face, but you can run local LLMs as feedback providers using Ollama (via LiteLLM) if that would meet your needs.

Here's an example notebook: