khoj-ai / khoj

Your AI second brain. Self-hostable. Get answers from the web or your docs. Build custom agents, schedule automations, do deep research. Turn any online or local LLM into your personal, autonomous AI (e.g. GPT, Claude, Gemini, Llama, Qwen, Mistral).
https://khoj.dev
GNU Affero General Public License v3.0

Create an easy way for users to share statistics and execution environment #517

Open nickanderson opened 1 year ago

nickanderson commented 1 year ago

From: https://discord.com/channels/1112065956647284756/1112066421577482262/1166885403798798427

I would like to have a way that makes it easy to provide feedback about the performance of local LLM responses.

I think it would be useful for both developers and users to be able to easily see information about the performance and quality of the local LLM in use.

Information that comes to mind:
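As an illustration of the kind of execution-environment and performance data being asked for, here is a minimal Python sketch. The field names, the optional use of torch for GPU detection, and the whitespace-based token estimate are assumptions for illustration only, not khoj's actual implementation.

```python
import json
import platform
import sys
import time


def collect_environment():
    """Gather basic execution-environment details (hypothetical field names)."""
    env = {
        "os": platform.platform(),
        "python": sys.version.split()[0],
        "machine": platform.machine(),
        "processor": platform.processor(),
    }
    try:
        import torch  # optional: only populated if PyTorch is installed

        env["gpu_available"] = torch.cuda.is_available()
        if env["gpu_available"]:
            env["gpu_name"] = torch.cuda.get_device_name(0)
    except ImportError:
        env["gpu_available"] = None
    return env


def time_completion(generate, prompt):
    """Wrap a text-generation call and report latency and rough tokens/sec."""
    start = time.perf_counter()
    response = generate(prompt)
    elapsed = time.perf_counter() - start
    # Whitespace splitting is a crude stand-in for the model's real tokenizer.
    tokens = len(response.split())
    return {
        "latency_s": round(elapsed, 2),
        "approx_tokens": tokens,
        "approx_tokens_per_s": round(tokens / elapsed, 2) if elapsed else None,
    }


if __name__ == "__main__":
    print(json.dumps(collect_environment(), indent=2))
```

Something along these lines, printed or copied as JSON, would make it easy for users to paste their environment and timing data into a bug report or Discord thread.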

sabaimran commented 1 year ago

Interesting note! These are definitely relevant performance metrics; thank you for collating this information. We have the /help endpoint in chat, which outputs some of this, but it could definitely be more detailed.
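For context, a more detailed diagnostics endpoint could look roughly like the sketch below, assuming a FastAPI-style server (khoj's backend is FastAPI-based). The route name /api/stats and the response shape are hypothetical, not khoj's existing /help output.

```python
import platform
import sys

from fastapi import FastAPI

app = FastAPI()


# Hypothetical route and payload shape; khoj's real /help output differs.
@app.get("/api/stats")
def stats():
    """Expose execution-environment details alongside chat performance data."""
    return {
        "environment": {
            "os": platform.platform(),
            "python": sys.version.split()[0],
        },
        "chat_model": "offline-llm",  # placeholder; would come from user config
        "recent_responses": [],  # e.g. per-turn latency / tokens-per-second entries
    }
```

Returning everything from one endpoint would let users share a single JSON blob covering both their setup and recent response timings.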