containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0
172 stars 31 forks

Provide Performance metrics for Model Service #495

Open axel7083 opened 6 months ago

axel7083 commented 6 months ago

I think it would be a nice touch to show a proper histogram for each resource while the inference server is running.

Here is what I really liked in Lens: being able to clearly see the CPU/RAM usage, spot peaks, etc.

[image: Lens resource metrics charts]

Since https://github.com/containers/podman-desktop/pull/6212 has been merged, we could keep a few minutes of history for some stats and display it to the user.
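A few minutes of history could be kept in a small rolling buffer that evicts samples older than the window. This is only a sketch with hypothetical names (`StatsSample`, `StatsHistory`); the actual stats shape comes from the PR linked above:

```typescript
// Hypothetical sketch: keep a rolling window of container stats samples.
// Field and class names are illustrative, not the extension's actual API.
interface StatsSample {
  timestamp: number;   // epoch milliseconds
  cpuPercent: number;  // CPU usage in percent
  memoryBytes: number; // RAM usage in bytes
}

class StatsHistory {
  private samples: StatsSample[] = [];

  // Default window: five minutes of history.
  constructor(private readonly windowMs: number = 5 * 60 * 1000) {}

  // Record a new sample and drop anything older than the window.
  push(sample: StatsSample): void {
    this.samples.push(sample);
    const cutoff = sample.timestamp - this.windowMs;
    this.samples = this.samples.filter((s) => s.timestamp >= cutoff);
  }

  // Snapshot of the retained samples, oldest first.
  all(): StatsSample[] {
    return [...this.samples];
  }
}
```

Polling the container stats on an interval and pushing each sample into such a buffer would give the frontend a bounded series to plot.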

axel7083 commented 6 months ago

Here is a POC of a monitoring component that could be placed above the inference container details:

https://github.com/projectatomic/ai-studio/assets/42176370/5af73ff9-eeab-49aa-8d2e-1d26cc794397

MariaLeonova commented 2 weeks ago

@axel7083 This looks good! How would the user interact with the graph? Would they check the exact number? Will they need a report produced? Can you please provide a screenshot of the field that appears on hover (or click?).

This is the best I was able to capture from the video: [screenshot of the hover tooltip]

axel7083 commented 2 weeks ago

The branch is 701 commits behind and was a POC, not really something definitive!

https://github.com/axel7083/ai-studio/tree/feature/showing-ram-usage-history

I would not take it as a production-ready example. The POC was using the @carbon/charts-svelte package; some examples of the library's capabilities can be found here: https://charts.carbondesignsystem.com/introduction
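For context, Carbon Charts consumes flat arrays of grouped records rather than nested series. A hypothetical adapter from the stats history to that shape might look like the following; the `group`/`date`/`value` field names follow Carbon Charts' documented line-chart data format, while everything else (sample shape, series names) is illustrative:

```typescript
// Hypothetical adapter: flatten CPU/RAM samples into the grouped
// { group, date, value } records a Carbon Charts LineChart expects.
interface StatsSample {
  timestamp: number;   // epoch milliseconds
  cpuPercent: number;  // CPU usage in percent
  memoryBytes: number; // RAM usage in bytes
}

interface ChartDatum {
  group: string; // series name, e.g. "CPU %" or "RAM (MiB)"
  date: Date;    // x-axis value on a time scale
  value: number; // y-axis value
}

function toChartData(samples: StatsSample[]): ChartDatum[] {
  // Emit two records per sample, one per series.
  return samples.flatMap((s) => [
    { group: 'CPU %', date: new Date(s.timestamp), value: s.cpuPercent },
    {
      group: 'RAM (MiB)',
      date: new Date(s.timestamp),
      value: s.memoryBytes / (1024 * 1024),
    },
  ]);
}
```

The resulting array would be passed to the chart component's `data` prop, with the axes configured in its options object.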