containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0

Move the AI Lab listening port in its own configuration page #1978

Closed by slemeur 5 days ago

slemeur commented 1 month ago

Is your enhancement related to a problem? Please describe

[screenshot]

Today, when you enable AI Lab, it starts an Ollama-compatible endpoint so that users can interact with Podman AI Lab programmatically.

The problem is that this is not very discoverable, and the configuration of the endpoint port feels disconnected as well.
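For context, programmatic interaction with the Ollama-compatible endpoint could look like the sketch below. The port number and model name are placeholders for illustration only, not the actual defaults; the request shape follows Ollama's `/api/generate` endpoint.

```python
import json
from urllib import request

# Placeholder: the real port is whatever the user configured for AI Lab.
AI_LAB_URL = "http://localhost:10434"

def build_generate_request(model: str, prompt: str) -> request.Request:
    """Build an Ollama-style /api/generate request against the AI Lab endpoint."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return request.Request(
        AI_LAB_URL + "/api/generate",
        data=payload.encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# "some-model" is a placeholder model name.
req = build_generate_request("some-model", "Hello!")
print(req.full_url)  # http://localhost:10434/api/generate
# response = request.urlopen(req)  # would contact the locally running server
```

Because the port is user-configurable, any such snippet depends on the user first discovering where that setting lives, which is exactly the discoverability problem described above.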

Describe the solution you'd like

1 - Remove AI Lab port from the status bar

2 - Create a new page accessible from the Podman AI Lab sidebar. Name of the page: "AI Lab Service"

On this page, we will provide the following explanation: "Integrate Podman AI Lab directly into your development workflows by using its REST API endpoints. Because the API is compatible with Ollama's endpoints, you can seamlessly access and utilize the capabilities of Podman AI Lab without relying on its graphical interface."

We will add a link to the APIs we support.

Then, we should have a "Local Server" section and a dedicated "Server Settings" page. The first item will be the port to listen on.

We should also keep the possibility to change the port in the settings of the Podman AI Lab extension.

Describe alternatives you've considered

No response

Additional context

No response

gastoner commented 3 weeks ago

@slemeur I'll try to mimic the preferences design as it is in Podman Desktop in ai-lab. [screenshot] - There are missing colors + I need to create inputs for numbers.

Also, in the nav on the left, I'm not sure about the title. Maybe something like "Configuration" with "AI Lab Service" below it? (This is so far kind of a mockup :D )