containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0

fix: concurrency issue between catalog and inference manager #1216

Closed axel7083 closed 3 months ago

axel7083 commented 3 months ago

What does this PR do?

To fix the concurrency issue (the InferenceManager fetching the container list before the catalog has loaded the models from its files), I changed the InferenceManager to refresh its inference servers whenever the catalog is updated, making the InferenceServer initialization event-based.
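The pattern described above can be sketched as follows. This is a minimal illustration, not the extension's actual code: the `Catalog`, `InferenceManager`, `onUpdate`, and `refreshServers` names here are assumptions chosen for the sketch, and the real classes in ai-lab carry more state and use the Podman Desktop APIs.

```typescript
// Hypothetical sketch of event-based initialization: the manager
// subscribes to catalog updates instead of reading the catalog once
// at construction time, which could race with the catalog's file load.

type Listener = () => void;

class Catalog {
  private models: string[] = [];
  private listeners: Listener[] = [];

  // Register a callback fired on every catalog update.
  onUpdate(listener: Listener): void {
    this.listeners.push(listener);
  }

  // Simulates the catalog finishing its asynchronous load from files.
  loadModels(models: string[]): void {
    this.models = models;
    this.listeners.forEach((l) => l());
  }

  getModels(): string[] {
    return this.models;
  }
}

class InferenceManager {
  public servers: string[] = [];

  constructor(private catalog: Catalog) {
    // Refresh whenever the catalog updates; no one-shot read at startup.
    catalog.onUpdate(() => this.refreshServers());
  }

  refreshServers(): void {
    // For the sketch, derive one "server" entry per known model.
    this.servers = this.catalog.getModels().map((m) => `server-for-${m}`);
  }
}

const catalog = new Catalog();
const manager = new InferenceManager(catalog);

// Before the catalog loads, the manager holds no stale snapshot.
// Once loadModels fires, the manager refreshes automatically.
catalog.loadModels(["llama", "mistral"]);
console.log(manager.servers);
```

Because the manager only derives its server list from catalog update events, the order in which the catalog load and the manager construction complete no longer matters.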

Screenshot / video of UI

What issues does this PR fix or reference?

Fixes https://github.com/containers/podman-desktop-extension-ai-lab/issues/1215

How to test this PR?