nberktumer / ha-bambu-lab-p1-spaghetti-detection

Bambu Lab P1 Home Assistant Spaghetti Detection Integration.
GNU General Public License v3.0

Install on Home Assistant OS #16

Open · SebastianOlthuis opened 1 week ago

SebastianOlthuis commented 1 week ago

I wanted to install this on Home Assistant OS because I didn't want to reinstall. The issue I had was that while the add-on was running, I couldn't start the Docker container on port 3333 because it said the port was already in use on the host.

[screenshot]

Conversely, if I stopped the add-on, I could start the Docker container just fine. However, if I then tried to start the add-on, I encountered the same problem.

[screenshot]
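To figure out what is already holding the port, something like this from the HAOS SSH shell can help (assuming BusyBox netstat is available in that shell, which it typically is on HAOS):

# Show which process is listening on 3333
netstat -tlnp | grep 3333
# List running containers with their published ports
docker ps --format '{{.Names}}\t{{.Ports}}'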

To address this, I removed the Docker container and recreated it with a changed port mapping like this:

docker create \
  --restart unless-stopped \
  --env ML_API_TOKEN=obico_api_secret \
  --publish 3333:3332 \
  --name ha_bambu_lab_p1_spaghetti_detection \
  nberk/ha_bambu_lab_p1_spaghetti_detection_standalone:latest
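Note that docker create only creates the container without running it; it still has to be started afterwards:

docker start ha_bambu_lab_p1_spaghetti_detection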

[screenshot]

By SSHing into Home Assistant OS, I was able to install it without a typical Docker installation. Hopefully this helps someone facing a similar issue; I saw a few posts about it online when I first looked into this.

SebastianOlthuis commented 1 week ago

Never mind. I don't know why, but it didn't give me an error at first, and the automation even said it was triggered 2-3 seconds ago constantly, but after a few minutes I got this again: [screenshot]

nberktumer commented 1 week ago

I think something else is using port 3333 on your server. So, regardless of the unknown service error, your --publish 3333:3332 argument is still wrong.

The publish argument values are in --publish <host_port>:<container_port> format. So, it should be --publish 3332:3333 instead. The ML detection backend runs on port 3333 inside the Docker container, but it should bind to port 3332 on your host server.
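Putting that together with your original command, the corrected invocation would be:

docker create \
  --restart unless-stopped \
  --env ML_API_TOKEN=obico_api_secret \
  --publish 3332:3333 \
  --name ha_bambu_lab_p1_spaghetti_detection \
  nberk/ha_bambu_lab_p1_spaghetti_detection_standalone:latest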

SebastianOlthuis commented 1 week ago

> I think something else is using port 3333 on your server. So, regardless of the unknown service error, your --publish 3333:3332 argument is still wrong.
>
> The publish argument values are in --publish <host_port>:<container_port> format. So, it should be --publish 3332:3333 instead. The ML detection backend runs on port 3333 inside the Docker container, but it should bind to port 3332 on your host server.

Thanks for the fast reply. Sadly it didn't fix it:

[screenshots]

SebastianOlthuis commented 1 week ago

Maybe this is stupid, but could this be the issue? Because that's not even the same subnet as my network.

[screenshot of IP configuration]

nberktumer commented 1 week ago

You should use port 3332 in the Obico ML API Host config. The ML API runs on port 3333 within the Docker container, but everything outside the container accesses it via port 3332, since you bound it to port 3332 with the --publish 3332:3333 argument.
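As a quick check that the binding works (the exact HTTP path the ML API serves isn't shown in this thread, but any HTTP response, even an error status, means the port is reachable):

curl -i http://localhost:3332/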

SebastianOlthuis commented 1 week ago

> You should use port 3332 in the Obico ML API Host config. The ML API runs on port 3333 within the Docker container, but everything outside the container accesses it via port 3332, since you bound it to port 3332 with the --publish 3332:3333 argument.

I tried this already after your last reply, because I wanted to make sure I'm not mixing something up, so I tried both 3332 and 3333 in the config. By config I mean the automation from your blueprint.

[screenshot]

In the config for the add-on I can't use 3333, so I'm on 3332, because otherwise I get the already-in-use-by-host error again.

nberktumer commented 1 week ago

I think this host configuration is correct, but it seems you are getting an "unknown service" error. Have you installed the HA integration in step 2 (not the blueprint or Docker container)?

SebastianOlthuis commented 1 week ago

Yes, I did, but I can redo it real quick by removing the blueprint & container.

nberktumer commented 1 week ago

Can you check whether the following entities are available in your system:

[screenshot]

Also the predict service:

[screenshot]

SebastianOlthuis commented 1 week ago

They were not; I checked as soon as I saw the service error for the first time.

But I have great news: completely redoing it worked, for some reason. Probably my mistake. [screenshot]

I also see the service now (it should stay empty, I assume).