containers / podman-desktop-extension-ai-lab

Work with LLMs on a local environment using containers
https://podman-desktop.io/extensions/ai-lab
Apache License 2.0

`http://host.containers.internal:<port>` does not work on Podman native Linux #1834

Open axel7083 opened 3 weeks ago

axel7083 commented 3 weeks ago

Bug description

We are using `http://host.containers.internal:<port>` for the MODEL_ENDPOINT since we removed the inference server from the recipe's Pod. That change was introduced in https://github.com/containers/podman-desktop-extension-ai-lab/pull/1503.
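
For context, here is a minimal sketch of the kind of availability check a recipe container performs against MODEL_ENDPOINT. The `/v1/models` path and the fallback URL below are assumptions for illustration, not the extension's actual code.

```typescript
// Hypothetical sketch: probing the model endpoint from inside a recipe container.
// MODEL_ENDPOINT is injected by AI Lab; the default below is an assumption.
const endpoint = process.env.MODEL_ENDPOINT ?? 'http://host.containers.internal:8000';

async function isModelAvailable(): Promise<boolean> {
  try {
    // /v1/models is an assumed OpenAI-compatible route, used here only as a probe.
    const response = await fetch(`${endpoint}/v1/models`);
    return response.ok;
  } catch {
    // On Podman native Linux this request fails because
    // host.containers.internal does not resolve to a reachable address.
    return false;
  }
}

isModelAvailable().then(ok => console.log(`model endpoint reachable: ${ok}`));
```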

Operating system

Fedora 40

Installation Method

Other

Version

next (development version)

Steps to reproduce

  1. Start a recipe on Podman native Linux.
  2. Observe that the recipe is stuck on the "checking model availability" step.

Relevant log output

No response

Additional context

cc @jeffmaury

jeffmaury commented 3 weeks ago

Should be an easy fix: change MODEL_ENDPOINT to point at the inference server container's IP, i.e. `<inference_server_container_ip>:8000`.
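
A rough sketch of what that could look like, assuming the IP is resolved with `podman inspect`; the container name and the Go template path below are placeholders, not the extension's actual identifiers.

```typescript
import { execFile } from 'node:child_process';
import { promisify } from 'node:util';

const execFileAsync = promisify(execFile);

// Hypothetical helper: read a container's IP from `podman inspect`.
// The template path may differ for containers attached to a named network.
async function containerIp(name: string): Promise<string> {
  const { stdout } = await execFileAsync('podman', [
    'inspect',
    '--format',
    '{{.NetworkSettings.IPAddress}}',
    name,
  ]);
  return stdout.trim();
}

// Build MODEL_ENDPOINT from the resolved IP; the container name is assumed.
async function buildModelEndpoint(): Promise<string> {
  const ip = await containerIp('ai-lab-inference-server');
  return `http://${ip}:8000`;
}

buildModelEndpoint().then(url => console.log(`MODEL_ENDPOINT=${url}`));
```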

axel7083 commented 1 week ago

I am not able to make a request from container A to container B using container B's IP. I opened https://github.com/containers/podman/issues/24260 to get some background and more information.
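
For reference, this is the kind of probe that fails, sketched in TypeScript as if run inside container A; the IP and port are placeholders standing in for values read from `podman inspect`.

```typescript
// Hypothetical probe from container A to container B by IP.
// On the affected Podman versions this request never completes,
// even with both containers on the same network.
const containerBIp = '10.88.0.5'; // placeholder: container B's IP

fetch(`http://${containerBIp}:8000/v1/models`, { signal: AbortSignal.timeout(5000) })
  .then(res => console.log(`reachable, status ${res.status}`))
  .catch(err => console.error(`unreachable: ${err}`));
```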

axel7083 commented 1 week ago

Following the response from the Podman team, this should be fixed in Podman 5.3; keeping this issue open for now.