avstack / jibri-pod-controller

Isolates Jibri pods from their Deployment when they start recording or livestreaming, and cleans them up when they finish.
Apache License 2.0

Issue with starting the jibri controller pod #6

Closed lephat08 closed 1 year ago

lephat08 commented 1 year ago

I think I have an issue with the Dockerfile, because the jibri pod shows that the container keeps failing to restart (back-off). However, running `docker build` with the Dockerfile completed without errors. Could you help me solve this problem? (screenshots attached)

jbg commented 1 year ago

Look at the logs of the container to see why it is crashing on startup.

lephat08 commented 1 year ago

> Look at the logs of the container to see why it is crashing on startup.

Actually, I could not exec into the container or get logs from the pod :(

jbg commented 1 year ago

You can't exec into a container that is not running, but you can still get its logs using kubectl.
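
For anyone hitting the same problem, here is a minimal sketch of how to pull logs from a crash-looping container with kubectl. The pod name and label selector below are placeholders; substitute the ones from your own deployment:

```shell
# Find the crashing pod (the label selector is an example)
kubectl get pods -l app=jibri

# Show the logs of the jibri-pod-controller container inside the pod;
# -c selects a specific container in a multi-container pod
kubectl logs <jibri-pod-name> -c jibri-pod-controller

# If the container has already crashed and been restarted, --previous
# shows the logs from the previous (failed) instance
kubectl logs <jibri-pod-name> -c jibri-pod-controller --previous

# The events in the describe output also show why the pod is backing off
kubectl describe pod <jibri-pod-name>
```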

jbg commented 1 year ago

From your screenshot, there do not appear to be any environment variables configured for the jibri-pod-controller container. jibri-pod-controller will be logging an error about the missing environment variables on startup and then exiting. Please have a look at the example deployment manifest in the README.
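
As an illustration of where the environment variables go, here is a sketch of a container spec. The variable names, values, and image reference below are placeholders, not the real configuration; the example deployment manifest in the README is authoritative:

```yaml
# Sketch only: placeholder names and values. Use the variables from the
# README's example deployment manifest.
containers:
  - name: jibri-pod-controller
    image: example/jibri-pod-controller:latest  # hypothetical image reference
    env:
      - name: EXAMPLE_REQUIRED_VARIABLE         # placeholder name
        value: "example-value"
      - name: ANOTHER_REQUIRED_VARIABLE         # placeholder name
        valueFrom:
          fieldRef:
            fieldPath: metadata.name            # e.g. inject the pod's own name
```

If a required variable is missing, the controller logs an error on startup and exits, which is what produces the restart back-off seen in the screenshots.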

lephat08 commented 1 year ago

> From your screenshot, there do not appear to be any environment variables configured for the jibri-pod-controller container. jibri-pod-controller will be logging an error about the missing environment variables on startup and then exiting. Please have a look at the example deployment manifest in the README.

Maybe I misunderstood the documentation. I will add the environment variables to the jibri controller. Thanks for your help!