WilliamKarolDiCioccio / open_local_ui

OpenLocalUI: Native desktop app for Windows, macOS, and Linux. Easily run Large Language Models locally, no complex setup required. Inspired by OpenWebUI's simplicity for LLM use.
MIT License

Display token statistics in Chat #50

Closed Rossi1337 closed 2 weeks ago

Rossi1337 commented 2 weeks ago

- Add a setting to the settings page
- Store statistics in the ChatMessages
- Display statistics in the chat widget when turned on in the settings

The trick was to remove the StringOutputParser from the processing chain. Then we get the ChatResponse objects directly, and the last one of those contains the statistics. I store these directly in the ChatMessage for later retrieval. The chat widget then uses these statistics to append a summary at the bottom (if activated in the settings).
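The approach above could be sketched roughly as follows. This is a hypothetical illustration, not the actual PR code: it assumes a langchain_dart-style API, and the class names (`ChatResult`, `StringOutputParser`), field names, and the `message` object are illustrative and may differ from the real project.

```dart
// Hypothetical sketch of the change described above; names are
// illustrative and may not match the actual langchain_dart API.

// Before: the parser reduced each chunk to plain text, discarding
// the metadata that carries the token statistics.
//   final chain = prompt.pipe(chatModel).pipe(StringOutputParser());

// After: without the parser, the stream yields result objects
// directly; the last one carries the generation statistics.
final chain = prompt.pipe(chatModel);

ChatResult? lastResult;
await for (final result in chain.stream(input)) {
  lastResult = result;
  // Append result.output.content to the chat widget as it streams.
}

// Store the statistics on the ChatMessage for later display in the
// chat widget's summary footer (when enabled in the settings).
if (lastResult != null) {
  message.totalTokens = lastResult.usage.totalTokens;
  message.generationMetadata = lastResult.metadata;
}
```

The key design point is that only the final streamed result needs to be retained: intermediate chunks contribute their text to the display, while the statistics arrive once, at the end of the stream.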

Rossi1337 commented 2 weeks ago

Fixes #40