Open SebastianOlthuis opened 1 week ago
Never mind, I don't know why, but it didn't give me an error. The automation even said it was used 2-3 seconds ago constantly, but after a few minutes this happened again.
I think something else is using port 3333 on your server. So, despite the unknown-service error, your `--publish 3333:3332` argument is still wrong.

The publish argument values are in `--publish <host_port>:<container_port>` format, so it should be `--publish 3332:3333` instead.

The ML detection backend should run on port 3333 inside the Docker container, but it should bind to port 3332 on your host server.
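A minimal sketch of the corrected invocation, assuming the mapping described above; the container name and image name are placeholders, not the exact command from this thread:

```shell
# --publish <host_port>:<container_port>
# Host port 3332 forwards to port 3333 inside the container,
# so the ML API listens on 3333 internally but is reached via 3332 from outside.
# "obico/ml_api" and "ml_api" are placeholder names for illustration only.
docker run -d \
  --name ml_api \
  --publish 3332:3333 \
  obico/ml_api
```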
Thanks for the fast reply. Sadly, it didn't fix it.
Maybe this is stupid, but could this be the issue? Because that's not even the same subnet as my network.
You should use port 3332 in the Obico ML API Host config. The ML API runs on port 3333 within the Docker container, but everything outside the container can access it via port 3332, since you bound it to that port with the `--publish 3332:3333` argument.
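To confirm whether anything is actually listening on the host port, a quick check from the host can help. This is a generic sketch, not a command from the thread, and it assumes bash is available (it relies on bash's `/dev/tcp` pseudo-files):

```shell
#!/usr/bin/env bash
# Returns success (0) if something accepts TCP connections on 127.0.0.1:$1.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null && { exec 3>&-; return 0; }
  return 1
}

# 3332 should be in use only while the ML API container is running.
if port_in_use 3332; then
  echo "port 3332 is in use"
else
  echo "port 3332 is free"
fi
```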
I tried this already after your last reply, because I wanted to make sure I'm not mixing something up, so I tried both 3332 and 3333 in the config. By config I mean the automation from your blueprint.
In the config from the add-on I can't use 3333, so I'm on 3332, because otherwise I get the "already in use by host" error again.
I think this host configuration is correct, but it seems you are getting an "unknown service" error. Have you installed the HA Integration in step 2 (not the blueprint or the Docker container)?
Yes, I did, but I can redo it real quick by removing the blueprint and container.
Can you check whether the following entities are available in your system:
Also the predict service:
They were not; I checked as soon as I saw the service error for the first time.
But I have great news: completely redoing it worked, for some reason. Probably my mistake.
I also see the service now (it should stay empty, I assume).
I wanted to install this on Home Assistant OS because I didn't want to reinstall. The issue I had was that when the add-on was running, I couldn't start the Docker container on port 3333 because it said the port was already in use on the host.
Conversely, if I stopped the add-on, I could start the Docker container just fine. However, if I then tried to start the add-on, I encountered the same problem.
To address this, I uninstalled the docker container and installed it with a changed port like this:
By using SSH into Home Assistant OS, I was able to install it without a typical Docker installation. Hopefully this helps someone facing a similar issue; I saw a few posts online about this when first looking into it.
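After a changed-port install like the one described, the mapping can be verified from the same SSH session. This is a generic Docker check, not a command from the thread; the expected container name will be whatever was chosen at install time:

```shell
# List running containers with their port mappings; for the ML API container
# you would expect an entry like 0.0.0.0:3332->3333/tcp.
docker ps --format 'table {{.Names}}\t{{.Ports}}'
```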