nberktumer / ha-bambu-lab-p1-spaghetti-detection

Bambu Lab P1 Home Assistant Spaghetti Detection Integration.
GNU General Public License v3.0

Use this integration with default Obico installation? #5

Closed: N1c093 closed this issue 2 months ago

N1c093 commented 4 months ago

Hello,

Is it possible to use this integration with the default Obico installation? See this guide: https://www.obico.io/docs/server-guides/install/ I can't see an option to create/set the required API key.

What is the difference between the official docker installation from Obico and your docker container?

Thank you

nberktumer commented 4 months ago

I am not quite familiar with the default Obico installation, but I've checked the Obico code and found that it uses the same environment variable:

https://github.com/TheSpaghettiDetective/obico-server/blob/49b9ff207af31db4dc6784f760a3536aa1b76552/backend/config/settings.py#L441C1-L442C1

I'm not sure if you can configure this token via the Obico panel, though. However, if you have the default Obico installation and know the token, you can use its ML API server with this integration. Just make sure to expose port 3333 for the ML API container; I think by default it is only accessible within the Obico Docker network.
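
For example, a minimal docker-compose override could publish that port. This is only a sketch: the `ml_api` service name and the 3333 container port are assumptions based on the default Obico compose file, so adjust them to match your setup.

```yaml
# docker-compose.override.yml -- a sketch only; the ml_api service name and
# the 3333 container port are assumptions based on the default Obico setup.
services:
  ml_api:
    ports:
      - "3333:3333"  # publish the ML API so it is reachable from Home Assistant
```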

This repo contains a fork of the Obico ML API code with very minor changes, such as exposing the port so that the API server is accessible from HA.

ForceConstant commented 4 months ago

@N1c093 did you get this working?

N1c093 commented 4 months ago

@ForceConstant Sadly not. I set up my Obico instance on Oracle Cloud with this guide: https://github.com/MallocArray/obico-oraclecloud. As I'm not familiar with Oracle Cloud, I don't know how to add the required environment variable.

ForceConstant commented 3 months ago

So I wanted to let you know that this is working fine using the standard installation of Obico in Docker. Specifically, this project only uses the ML service and doesn't communicate with the web/tasks services, so you can leave those out. Here is my docker-compose.

```yaml
services:
  redis:
    restart: unless-stopped
    image: redis:7.2-alpine
  ml_api:
    restart: unless-stopped
    image: ghcr.io/thespaghettidetective/obico-server/ml:0.1.0-release.1.4241
    environment:
```
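
The snippet above cuts off at the environment section. A fuller sketch might look like the following; the `ML_API_TOKEN` variable name and the 3333 port mapping are assumptions rather than values confirmed in this thread, so check them against your own Obico installation.

```yaml
# A sketch of a complete compose file, assuming the ML API listens on port 3333
# and reads its auth token from ML_API_TOKEN (both are assumptions).
services:
  redis:
    restart: unless-stopped
    image: redis:7.2-alpine

  ml_api:
    restart: unless-stopped
    image: ghcr.io/thespaghettidetective/obico-server/ml:0.1.0-release.1.4241
    environment:
      ML_API_TOKEN: "change-me"  # token the HA integration will use (assumed variable name)
    ports:
      - "3333:3333"              # publish the ML API so Home Assistant can reach it
```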