containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0

Document that the model serving in ai-lab.yaml is optional #1882

Open · feloy opened 1 month ago

feloy commented 1 month ago

Is your enhancement related to a problem? Please describe

In the recipes catalog, if a backend is specified for a recipe, the internal inference server is used instead of any model-serving container defined in the recipe's ai-lab.yaml.

Describe the solution you'd like

It should be documented that, for this reason, the model-serving container is not required in the recipe's ai-lab.yaml.
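
For illustration, such a recipe's ai-lab.yaml could then declare only the application container. The sketch below is hypothetical and loosely modeled on existing recipes; the container name, paths, and port are placeholders, not a prescribed schema.

```yaml
version: v1.0
application:
  containers:
    # Only the application container is declared here. No model-service
    # container is needed, because AI Lab starts its own inference
    # server for the backend declared in the recipes catalog.
    - name: chat-app              # hypothetical container name
      contextdir: app             # illustrative path
      containerfile: Containerfile
      ports:
        - 8501                    # illustrative application port
```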

Describe alternatives you've considered

No response

Additional context

No response