Podman AI Lab is an open source extension for Podman Desktop that lets you work with LLMs (Large Language Models) in a local environment. Featuring a recipe catalog with common AI use cases, a curated set of open source models, and a playground for learning, prototyping, and experimentation, Podman AI Lab helps you quickly and easily get started bringing AI into your applications without depending on infrastructure beyond your laptop, ensuring data privacy and security.
Podman AI Lab uses Podman machines to run inference servers for LLM models and AI applications. AI models can be downloaded, and common formats like GGUF, PyTorch, or TensorFlow are supported.
Podman AI Lab provides a curated list of open source AI models and LLMs. Once downloaded, the models are available to be used for AI applications, model services and playgrounds.
Once a model is downloaded, a model service can be started. A model service is an inference server running in a container that exposes the model through the well-known chat API common to many providers.
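Because the model service speaks this widely adopted chat API, any compatible client can query it. Below is a minimal Python sketch, assuming the service listens on localhost at the port shown in the service details; the port and model name are placeholders, not fixed values:

```python
import json
import urllib.request

# Placeholder endpoint: use the host/port shown in your model service details.
ENDPOINT = "http://localhost:40693/v1/chat/completions"


def build_chat_request(prompt, model="my-local-model"):
    """Build a chat-completions payload (the model name is illustrative)."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def ask(prompt):
    """Send the prompt to the local inference server and return the reply text."""
    payload = json.dumps(build_chat_request(prompt)).encode()
    req = urllib.request.Request(
        ENDPOINT, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


# Example usage (requires a running model service):
# answer = ask("What is Podman?")
```

Only the standard library is used here; any OpenAI-style client library could be substituted.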
The integrated Playground environments allow you to experiment with available models locally. An intuitive user prompt helps you explore the capabilities and accuracy of various models and find the best model for the use case at hand. The Playground interface also lets you tune model parameters to optimize each model's settings and attributes.
Once an AI model is available through a well-known endpoint, it's easy to imagine a new world of applications that will connect to and use the AI model. Podman AI Lab supports AI applications as a set of containers that are connected together.
Podman AI Lab ships with a so-called Recipes Catalog that helps you navigate a number of core AI use cases and problem domains such as Chat Bots, Code Generators and Text Summarizers. Each recipe comes with detailed explanations and sample applications that can be run with various large language models (LLMs). Experimenting with multiple models allows finding the optimal one for your use case.
OS:
Compatible with Windows, macOS & Linux
Software:
Hardware:
LLMs are heavy resource consumers, both in terms of memory and CPU. Each of the provided models consumes about 4 GiB of memory and requires at least 4 CPUs to run.
We therefore recommend a minimum of 12 GB of memory and at least 4 CPUs for the Podman machine.
As an additional recommended practice, do not run more than 3 models concurrently.
Please note that this is not relevant for WSL on Windows, as WSL shares memory and CPU with the host desktop.
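On macOS and Linux, the sizing recommendation above can be applied when creating the Podman machine. A setup sketch (the machine name is arbitrary; adjust the values to your hardware):

```shell
# Create a Podman machine sized for local LLM work:
# 12 GiB of memory (in MiB) and 4 CPUs, matching the recommendation above.
podman machine init --memory 12288 --cpus 4 ai-lab-machine
podman machine start ai-lab-machine
```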
You can install the Podman AI Lab extension directly inside of Podman Desktop.
Go to Extensions > Catalog > Install Podman AI Lab.
To install a development version, use the Install custom...
action as shown in the recording below.
The name of the image to use is ghcr.io/containers/podman-desktop-extension-ai-lab. You can get released tags for the image at https://github.com/containers/podman-desktop-extension-ai-lab/pkgs/container/podman-desktop-extension-ai-lab.
Let's select a model from the catalog and download it locally to our workstation.
Once a model is available locally, let's start an inference server.
Want to help develop and contribute to Podman AI Lab?
You can use pnpm watch --extension-folder
from the Podman Desktop directory to automatically rebuild and test the AI Lab extension:
Note: make sure you have the appropriate prerequisites installed.
git clone https://github.com/containers/podman-desktop
git clone https://github.com/containers/podman-desktop-extension-ai-lab
cd podman-desktop-extension-ai-lab
corepack enable pnpm
pnpm install
pnpm build
cd ../podman-desktop
pnpm watch --extension-folder ../podman-desktop-extension-ai-lab/packages/backend
If you are live editing the frontend package, from packages/frontend folder:
$ pnpm watch
We'll be adding a way to let a user cleanup their environment: see issue https://github.com/containers/podman-desktop-extension-ai-lab/issues/469. For the time being, please consider the following actions:
$HOME/podman-desktop/ai-lab
The extension provides by default a curated list of recipes, models and categories. However, this system is extensible and you can define your own.
To enhance the existing catalog, you can create a file located in the extension storage folder: $HOME/.local/share/containers/podman-desktop/extensions-storage/redhat.ai-lab/user-catalog.json.
It must follow the same format as the default catalog in the sources of the extension.
:information_source: The default behaviour is to append the items of the user's catalog to the default one.
:warning: Each item (recipe, model, or category) has a unique id. When a conflict between the default catalog and the user's catalog is found, the user's item overwrites the default one.
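For illustration, a minimal user-catalog.json might look like the sketch below. The field names are assumptions inferred from the description above; consult the default catalog in the extension sources for the authoritative schema, and note that the id, URL, and model name are placeholders:

```json
{
  "recipes": [],
  "models": [
    {
      "id": "my-org/my-model-gguf",
      "name": "My Model (GGUF)",
      "description": "A custom model appended to the default catalog.",
      "url": "https://example.com/models/my-model.Q4_K_M.gguf",
      "license": "Apache-2.0"
    }
  ],
  "categories": []
}
```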
Sample applications may be added to the catalog. See packaging guide for detailed information.
The roadmap is always open and we are looking for your feedback. Please create new issues and upvote the issues that feel most important to you.
We will be working on the following items:
You can provide your feedback on the extension with this form or create an issue on this repository.