containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0

Race condition between Catalog and Inference Manager #1215

Closed axel7083 closed 2 weeks ago

axel7083 commented 2 weeks ago

Bug description

Since https://github.com/containers/podman-desktop-extension-ai-lab/pull/1175, where we decided to init everything during the extension's activate function (cc @jeffmaury), we now have a problem: we often list all the containers before the catalog has been loaded.

(screenshot)

An easy fix would be to change the definition of InferenceServer, whose models property is redundant: the frontend already has the model store, and the backend has the ModelsManager.
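A minimal sketch of what that change could look like. The shapes of InferenceServer and ModelInfo below are assumptions for illustration, not the extension's actual definitions, and resolveModels is a hypothetical stand-in for a ModelsManager/model-store lookup:

```typescript
// Assumed minimal shape of a catalog entry.
interface ModelInfo {
  id: string;
  name: string;
}

// Before: the server embeds full model objects, duplicating catalog data.
interface InferenceServerBefore {
  container: { containerId: string };
  models: ModelInfo[];
}

// After: the server only keeps model ids; consumers resolve them through
// the model store (frontend) or the ModelsManager (backend).
interface InferenceServerAfter {
  container: { containerId: string };
  models: string[];
}

// Hypothetical resolver standing in for the ModelsManager lookup;
// unknown ids are silently dropped here for simplicity.
function resolveModels(
  server: InferenceServerAfter,
  catalog: Map<string, ModelInfo>,
): ModelInfo[] {
  return server.models
    .map((id) => catalog.get(id))
    .filter((m): m is ModelInfo => m !== undefined);
}
```

With ids instead of objects, a server created before the catalog finishes loading no longer snapshots stale or empty model data; it resolves models lazily at read time.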

Operating system

Windows 11

Installation Method

Other

Version

next (development version)

Steps to reproduce

No response

Relevant log output

No response

Additional context

No response

axel7083 commented 2 weeks ago

Okay, changing the models property from ModelInfo[] to string[] is a nightmare:

(screenshot)

We need to fix the race condition then :(
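One common way to remove this kind of race is to gate consumers behind a promise that resolves once the catalog has loaded, instead of reading a possibly empty catalog during activate. All names below (CatalogManager, InferenceManager, load, getModels, listServers) are hypothetical, not the extension's actual API:

```typescript
// Sketch: the catalog exposes a readiness promise; consumers await it.
class CatalogManager {
  private readyResolve!: () => void;
  private readonly ready: Promise<void> = new Promise(
    (resolve) => (this.readyResolve = resolve),
  );
  private models: string[] = [];

  // Called during activate(); may finish after other managers start.
  async load(): Promise<void> {
    await new Promise((r) => setTimeout(r, 10)); // simulate async I/O
    this.models = ['llama', 'mistral'];
    this.readyResolve();
  }

  // Readers block until the catalog is actually populated.
  async getModels(): Promise<string[]> {
    await this.ready;
    return this.models;
  }
}

class InferenceManager {
  constructor(private readonly catalog: CatalogManager) {}

  // Listing now waits for the catalog, so it can never observe an
  // empty catalog even if it is invoked first during activation.
  async listServers(): Promise<string[]> {
    const models = await this.catalog.getModels();
    return models.map((m) => `server-for-${m}`);
  }
}
```

The design choice here is to keep activate() non-blocking (both managers start immediately) while serializing only the reads that actually depend on catalog data.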