OpenLocalUI: Native desktop app for Windows, macOS, and Linux. Easily run Large Language Models locally with no complex setup required. Inspired by OpenWebUI's simplicity for LLM use.
- Add a setting to the settings page.
- Store the statistics in the ChatMessages.
- Display the statistics in the chat widget when turned on in the settings.
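The flow described by these steps can be sketched roughly as follows. Python is used here purely for illustration; the class names, fields, and the `show_statistics` toggle are hypothetical stand-ins, not the app's actual API:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class GenerationStats:
    # Hypothetical statistics a model backend might report per response.
    total_duration_ms: int
    completion_tokens: int

    def tokens_per_second(self) -> float:
        # Derived metric shown in the chat widget's summary line.
        return self.completion_tokens / (self.total_duration_ms / 1000)

@dataclass
class ChatMessage:
    text: str
    # Stats are optional: older messages may not carry them.
    stats: Optional[GenerationStats] = None

@dataclass
class Settings:
    # The new toggle on the settings page (hypothetical name).
    show_statistics: bool = False

def render_message(msg: ChatMessage, settings: Settings) -> str:
    # Append the statistics summary only when the setting is enabled
    # and the message actually carries stats.
    out = msg.text
    if settings.show_statistics and msg.stats is not None:
        out += (f"\n[{msg.stats.completion_tokens} tokens, "
                f"{msg.stats.tokens_per_second():.1f} tok/s]")
    return out
```

The stats field is optional by design, so messages recorded before the feature existed still render normally.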
The trick was to remove the StringOutputParser from the processing chain, so we get the ChatResponse objects directly. The last of these contains the statistics, which I then store directly in the ChatMessage for later retrieval. The chat widget then uses these statistics to append a summary at the bottom (if activated in the settings).
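The same idea in a minimal Python sketch: with a string output parser in the chain, each streamed chunk is reduced to plain text and the metadata is lost; consuming the raw response objects instead lets us keep the statistics from the final chunk. The `ChatResponse` type and parser here are simplified stand-ins, not the actual chain classes:

```python
from dataclasses import dataclass
from typing import Iterable, Optional

@dataclass
class ChatResponse:
    # Simplified stand-in for the backend's streamed response object.
    text: str
    stats: Optional[dict] = None  # only present on the final chunk

def string_output_parser(chunks: Iterable[ChatResponse]) -> Iterable[str]:
    # What the removed parser effectively did: keep only the text,
    # discarding the statistics carried by the last chunk.
    for chunk in chunks:
        yield chunk.text

def consume_stream(chunks: Iterable[ChatResponse]):
    # Without the parser we see the ChatResponse objects directly;
    # the final chunk carries the statistics, which we keep so they
    # can be stored in the ChatMessage afterwards.
    text_parts = []
    stats = None
    for chunk in chunks:
        text_parts.append(chunk.text)
        if chunk.stats is not None:
            stats = chunk.stats
    return "".join(text_parts), stats
```

Accumulating the text manually replaces what the parser did, while the statistics from the last chunk survive to be attached to the stored message.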