mpromonet / webrtc-streamer

WebRTC streamer for V4L2 capture devices, RTSP sources and Screen Capture
https://webrtcstreamer.agreeabletree-365b9a90.canadacentral.azurecontainerapps.io/?layout=2x2
The Unlicense

webrtc container errors on no decoder found #573

Closed shaunmitchellve closed 1 year ago

shaunmitchellve commented 1 year ago

Describe the bug

When running the webrtc-streamer container in a Kubernetes deployment (I also get the same error on Docker Desktop for Mac), I get an error when it connects to the rtsp IP camera:

Error:

[071:130][64] (VideoDecoder.h:228): VideoDecoder::DecoderThread no decoder
[071:180][64] (VideoDecoder.h:228): VideoDecoder::DecoderThread no decoder
[071:210][64] (VideoDecoder.h:228): VideoDecoder::DecoderThread no decoder
[071:259][64] (VideoDecoder.h:228): VideoDecoder::DecoderThread no decoder
[071:289][65] (livevideosource.h:124): cannot parse sps
NOTIFY failed

When I run webrtc-streamer on my mac laptop and connect to it using the webrtc web component it works awesome!

To Reproduce

I created a ConfigMap to provide the config file:

apiVersion: v1
kind: ConfigMap
metadata:
  name: webrtc-config
data:
  config.json: |
    {
      "urls":{
        "localcam": { "options": "rtptransport=tcp&timeout=60", "video": "rtsp://USER:PASSWORD@172.16.0.x:88/videoMain"}
      }
    }

Then created the deployment:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: webrtc-streamer
  namespace: default
  labels:
    app: streamer
spec:
  replicas: 1
  selector:
    matchLabels:
      app: streamer
  template:
    metadata:
      labels:
        app: streamer
    spec:
      containers:
        - name: webrtc-streamer
          image: mpromonet/webrtc-streamer
          ports:
            - containerPort: 8000
          args: ["-C", "/webrtc-streamer/config.json", "-v"]
          volumeMounts:
            - mountPath: /webrtc-streamer
              name: webrtc-streamer-config
      volumes:
        - name: webrtc-streamer-config
          configMap:
            name: webrtc-config

Created a service for ingress:

apiVersion: v1
kind: Service
metadata:
  name: node-port-svc-streamer
spec:
  type: NodePort
  selector:
    app: streamer
  ports:
  - protocol: TCP
    port: 8000
    targetPort: 8000
    nodePort: 30001

I used the webrtc web component to connect:

<webrtc-streamer url="localcam" options="rtptransport=tcp&timeout=60" webrtcurl="http://172.16.x.x:30001"></webrtc-streamer>

Expected behavior

webrtc-streamer should connect and decode the stream like it does when running directly on my mac laptop.

Additional context

I have confirmed that I can reach the rtsp camera from the K8s cluster / pod by pinging its IP address.
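Note that ping only proves ICMP reachability; the RTSP session itself needs a working TCP connection on the camera port. A minimal probe sketch (the host, port, and path are assumptions taken from the camera URL in the config above) that sends an RTSP OPTIONS request and reports the status line:

```python
# Minimal RTSP reachability probe: opens a TCP connection to the camera port,
# sends an OPTIONS request, and returns the server's status line.
import socket

def rtsp_probe(host, port, path="/videoMain", timeout=5.0):
    """Send an RTSP OPTIONS request and return the server's status line."""
    req = (f"OPTIONS rtsp://{host}:{port}{path} RTSP/1.0\r\n"
           "CSeq: 1\r\n"
           "User-Agent: rtsp-probe\r\n"
           "\r\n")
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(req.encode("ascii"))
        reply = sock.recv(4096)
    # First line of the reply is the RTSP status line
    return reply.split(b"\r\n", 1)[0].decode("ascii", "replace")

# e.g., run from a debug pod inside the cluster:
#   rtsp_probe("172.16.0.x", 88)   # expect "RTSP/1.0 200 OK"
```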

shaunmitchellve commented 1 year ago

I increased logging to check whether webrtc-streamer can see the camera:

[054:444][13] (PeerConnectionManager.cpp:1247): Adding Stream to map
[054:444][10] (rtp_transmission_manager.cc:189): Reusing an existing video transceiver for AddTrack.
Created new TCP socket 11 for connection
Connecting to 172.16.0.X port 88 on socket 11...
[054:444][13] (PeerConnectionManager.cpp:1273): VideoTrack added to PeerConnection
[054:444][13] (PeerConnectionManager.cpp:1281): Cannot create capturer audio:localcam
...remote connection opened
Sending request: DESCRIBE rtsp://USER:PASSWORD@172.16.0.193:88/videoMain RTSP/1.0
CSeq: 2
User-Agent: LIVE555 Streaming Media v2023.03.30
Accept: application/sdp

[054:447][10] (media_session.cc:951): RED codec red is missing an associated payload type.
[054:449][10] (media_session.cc:951): RED codec red is missing an associated payload type.
[054:450][10] (PeerConnectionManager.h:128): virtual void PeerConnectionManager::CreateSessionDescriptionObserver::OnSuccess(webrtc::SessionDescriptionInterface*) type:answer sdp:v=0

One thing I noticed while looking at the logs is that the basic_port_allocator.cc sees 3 networks:

[054:453][11] (basic_port_allocator.cc:771): Count of networks: 3
[054:453][11] (basic_port_allocator.cc:773): Net[eth0:192.168.0.x/32:Ethernet:id=1]
[054:453][11] (basic_port_allocator.cc:773): Net[lo:0:0:0:x:x:x:x:x/128:Loopback:id=3]
[054:453][11] (basic_port_allocator.cc:773): Net[lo:127.0.0.x/8:Loopback:id=2]

These are the pod's networks, and it seems webrtc-streamer sets itself up to listen for UDP on the pod network (id=1). However, there is no UDP ingress configured on the Kubernetes service, so I'm not sure whether this would cause an issue.
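For reference, a UDP NodePort service would look something like the sketch below. The port numbers are hypothetical, and there is a real caveat: WebRTC/ICE typically negotiates ephemeral UDP ports, so a single fixed NodePort may not match the ports the ICE candidates actually advertise.

```yaml
# Hypothetical UDP ingress for the streamer pod; port numbers are examples only.
apiVersion: v1
kind: Service
metadata:
  name: node-port-svc-streamer-udp
spec:
  type: NodePort
  selector:
    app: streamer
  ports:
  - name: rtp-udp
    protocol: UDP
    port: 30002        # assumed UDP port; ICE may use ephemeral ports instead
    targetPort: 30002
    nodePort: 30002
```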

There is a response to the PLAY request:

Received a complete PLAY response:
RTSP/1.0 200 OK
CSeq: 6
Date: Wed, Apr 05 2023 22:40:47 GMT
Range: npt=0.000-
Session: 1A7F7B9B
RTP-Info: url=rtsp://172.16.0.X:65534/videoMain/track1;seq=33998;rtptime=1268361422,url=rtsp://172.16.0.X:65534/videoMain/track2;seq=34466;rtptime=3785475019

[054:591][65] (livevideosource.h:124): cannot parse sps
NOTIFY failed
[054:600][64] (VideoDecoder.h:228): VideoDecoder::DecoderThread no decoder
shaunmitchellve commented 1 year ago

I've tried adding the local STUN and TURN servers to the deployment using the external cluster IP address:

"-S0.0.0.0:3478", "-s172.16.0.x:30002", "-T0.0.0.0:3479", "-tturn:turn@172.16.0.x:30003"

This sets up properly, and tweaking the service to accept the NodePorts and direct them to the pod ports works. However, ICE is still offering the pod's local IP as a candidate, which isn't routable outside the cluster, so I have a feeling the web component can't reach the webrtc-streamer service because it's trying to connect to the pod's 192.168.x.x IP address.

I could be totally wrong and going down the wrong path here.

mpromonet commented 1 year ago

Hi,
The initial error seems to be 'cannot parse sps'; you should probably look at the SDP sent by the rtsp source.
Best Regards,
Michel
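Following this advice, one quick check is to pull `sprop-parameter-sets` out of the camera's DESCRIBE response and confirm that the first value base64-decodes to an actual SPS NAL unit (type 7). A sketch; the SDP fragment below is invented for illustration, substitute the real response:

```python
# Decode the first sprop-parameter-sets value from an H.264 fmtp line and
# inspect its NAL unit type (7 = SPS, 8 = PPS).
import base64
import re

def first_sps(sdp):
    """Return the decoded first sprop-parameter-sets value, or None."""
    m = re.search(r"sprop-parameter-sets=([A-Za-z0-9+/=]+)", sdp)
    if not m:
        return None
    b64 = m.group(1)
    b64 += "=" * (-len(b64) % 4)          # some cameras omit base64 padding
    return base64.b64decode(b64)

# Made-up but well-formed example SDP fragment:
sdp = ("m=video 0 RTP/AVP 96\r\n"
       "a=rtpmap:96 H264/90000\r\n"
       "a=fmtp:96 packetization-mode=1;"
       "sprop-parameter-sets=Z0IAKeNQFAe2AtwEBAaQeJEV,aM48gA==\r\n")

sps = first_sps(sdp)
print(sps[0] & 0x1F)   # low 5 bits are the NAL unit type; prints 7 (SPS)
```

If the attribute is missing, or the decoded NAL type isn't 7, that would explain the 'cannot parse sps' failure upstream of the decoder.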

shaunmitchellve commented 1 year ago

Thanks Michel,

After much further research I've discovered this has nothing to do with the rtsp source and everything to do with the WebRTC protocol and the STUN and TURN server processes. Kubernetes applies a number of NAT / SNAT translations within the cluster, and WebRTC isn't designed to handle that many IP translations. By the time the client's UDP packet reaches webrtc-streamer, it has no identifying information about the client left, so peer-to-peer connections become impossible.
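For context, the workaround most often suggested for this class of problem (a sketch, not verified here) is to bypass the pod network entirely so ICE candidates carry the node's routable address rather than the pod's:

```yaml
# Sketch only: hostNetwork puts the pod in the node's network namespace, so
# ICE candidates use the node's IP instead of the pod's 192.168.x.x address.
# This trades away pod isolation and can cause port conflicts on the node.
spec:
  template:
    spec:
      hostNetwork: true
      containers:
        - name: webrtc-streamer
          image: mpromonet/webrtc-streamer
```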

Others are creating ways to run this in Kubernetes, but it requires a bunch of custom objects in the cluster. Nothing to do with webrtc-streamer and its awesomeness, just protocol stuff. Thanks, and I will close this issue.