luxonis / depthai

DepthAI Python API utilities, examples, and tutorials.
https://docs.luxonis.com
MIT License
937 stars 232 forks

RTSP Output as a Pipeline Node #294

Open Luxonis-Brandon opened 3 years ago

Luxonis-Brandon commented 3 years ago

Start with the why:

In some cases it may be desirable to provide RTSP (Real Time Streaming Protocol, i.e. IP-camera-style) output over Ethernet for compatibility with existing software stacks. This gives instant, drop-in integration, particularly with closed-source (or hard-to-modify) tools that already accept RTSP-compatible inputs.

This becomes increasingly powerful when the RTSP output is a node in the pipeline: instead of only the whole video feed being output as an RTSP stream, any video stream in the pipeline can be output. Take, for example, using an object detector to guide digital PTZ (https://github.com/luxonis/depthai/issues/135), and then outputting this pan/tilt/zoomed stream directly over RTSP.

This way, the RTSP output would appear to a computer, smartphone, YouTube, etc. as if someone were actually just moving a camera around.

Move to the how:

Leverage live555 and the Gen2 Pipeline Builder (#136) to implement an RTSP output node over Ethernet to work with POE-capable DepthAI devices like the BW2098POE and other future DepthAI hardware designs (such as OAK-1-POE and OAK-D-POE) based on the Ethernet-capable BW2099 module.

Move to the what:

Implement an RTSP output node in the Gen2 Pipeline Builder (https://github.com/luxonis/depthai/issues/136). This will allow the output of any node which produces video to be streamed over Ethernet.

eric-schleicher commented 3 years ago

The ZED RTSP examples might be worth reviewing, even though they work in fundamentally different ways. They set up a meta channel for sending tracking information along with other annotations (like human pose, etc.). Having this in a node with configurable pass-through for other streams running on the module would be excellent.

https://github.com/stereolabs/zed-gstreamer

Luxonis-Brandon commented 3 years ago

Thanks for the heads up!

YijinLiu commented 3 years ago

I like the idea. Before this is ready, can you give some guidance on how to feed the VideoEncoder result into an RTSP stream? I tried to do it with this GStreamer pipeline:

appsrc name=source is-live=true block=true format=GST_FORMAT_TIME emit-signals=false caps=video/x-h264,format=I420,width=3840,height=2160,framerate=30/1 ! rtph264pay config-interval=1 name=pay0 pt=96

and feed data like this:

pkt = queue.get()
seq = pkt.getSequenceNum()
seconds = pkt.getTimestamp().total_seconds()
data = pkt.getData()
print(f'Sending packet {seq} {len(data)}@{seconds}')
buf = Gst.Buffer.new_allocate(None, len(data), None)
buf.fill(0, data)
buf.duration = self._duration
buf.pts = buf.dts = int(seconds * 1000000)
if self._pts < 0:
    self._pts = buf.pts
buf.offset = buf.pts - self._pts
self._appsrc.emit('push-buffer', buf)
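
A separate issue in the snippet above: GStreamer buffer timestamps (`pts`, `dts`, `duration`) are expressed in nanoseconds (`Gst.SECOND` is 10^9), while `int(seconds * 1000000)` produces microseconds. A stdlib-only sketch of the conversion (the helper name is hypothetical):

```python
# GStreamer timestamps (buf.pts, buf.dts, buf.duration) are in nanoseconds;
# GST_SECOND is defined as 10**9. Scaling by 10**6 (microseconds), as in the
# snippet above, gives downstream elements bogus timing.
GST_SECOND = 10 ** 9  # same value as Gst.SECOND

def to_gst_time(seconds: float) -> int:
    """Convert a timestamp in seconds to GStreamer nanoseconds (hypothetical helper)."""
    return int(seconds * GST_SECOND)

frame_duration = GST_SECOND // 30  # duration of one frame at 30 fps
```

With this, the assignment above would read `buf.pts = buf.dts = to_gst_time(seconds)`.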

It doesn't work; it always fails with:

basesrc gstbasesrc.c:3072:gst_base_src_loop: error: Internal data stream error.
0:00:07.970945498   851 0x7f390400e1e0 WARN                 basesrc gstbasesrc.c:3072:gst_base_src_loop: error: streaming stopped, reason not-negotiated (-4)
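
The `not-negotiated` failure above is consistent with a caps problem: `format=I420` is a raw-video field, while `rtph264pay` negotiates encoded caps via `stream-format` and `alignment`. A minimal sketch of corrected caps for pre-encoded output (assuming the encoder emits Annex-B byte-stream, which is typical for hardware H.264 encoders; not verified against this exact setup):

```python
# Sketch: appsrc caps for pre-encoded H.264 (assumption: the VideoEncoder
# emits Annex-B byte-stream output). 'format=I420' describes raw video and
# does not belong on video/x-h264; rtph264pay negotiates on stream-format
# and alignment instead.
CAPS = (
    "video/x-h264,"
    "stream-format=byte-stream,"  # Annex-B start codes
    "alignment=au,"               # one buffer per access unit (whole frame)
    "width=3840,height=2160,framerate=30/1"
)
PIPELINE = (
    "appsrc name=source is-live=true block=true format=GST_FORMAT_TIME "
    f"caps={CAPS} ! rtph264pay config-interval=1 name=pay0 pt=96"
)
print(PIPELINE)
```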

Luxonis-Brandon commented 3 years ago

Looking. I think we have an example (https://github.com/luxonis/depthai-experiments/tree/master/rtsp-streaming) but I don't think it's been updated to work with POE yet. So I'm giving it a shot first and will update if it can work with POE.

Luxonis-Brandon commented 3 years ago

I confirmed that that is Gen1, so it would need to be updated. Looking at the WebRTC example (https://github.com/luxonis/depthai-experiments/tree/master/gen2-webrtc-streaming) now.

Luxonis-Brandon commented 3 years ago

Works on USB (screenshot attached). Modifying requirements.txt now to see if it will work on POE.

Luxonis-Brandon commented 3 years ago

Yes, works over POE (screenshot attached). This is from my OG/black OAK-D-POE prototype (black is bad for direct sunlight, so we changed it to silver for production).

(photo: IMG_6575)

Luxonis-Brandon commented 3 years ago

(Note that streaming disparity there seems broken.)

YijinLiu commented 3 years ago

@Luxonis-Brandon Thanks for the quick action! I looked at https://github.com/luxonis/depthai-experiments/tree/master/rtsp-streaming before posting here. IIUC, that example gets frames from DepthAI and uses x264 to encode the video. I am trying to see whether there is a way to use depthai.VideoEncoder's result, streaming it to RTSP directly, to avoid encoding the frames on the CPU.
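
When pushing pre-encoded output straight into an RTP/RTSP pipeline, one practical detail is that a client can only begin decoding at a keyframe, so it is common to scan the Annex-B byte stream for an IDR NAL unit before starting to push buffers. A stdlib-only sketch (function names are hypothetical):

```python
def nal_unit_types(annexb: bytes):
    """Yield H.264 NAL unit types found after Annex-B start codes
    (0x000001 or 0x00000001)."""
    i = 0
    while True:
        j = annexb.find(b"\x00\x00\x01", i)
        if j < 0 or j + 3 >= len(annexb):
            return
        yield annexb[j + 3] & 0x1F  # low 5 bits of the NAL header byte
        i = j + 3

def has_keyframe(annexb: bytes) -> bool:
    """True if the buffer contains an IDR slice (NAL type 5)."""
    return 5 in nal_unit_types(annexb)

# Toy example: SPS (type 7), PPS (type 8), then an IDR slice (type 5).
sample = (b"\x00\x00\x00\x01\x67" + b"\x00\x00\x00\x01\x68" +
          b"\x00\x00\x00\x01\x65")
```

A streamer would typically drop packets until `has_keyframe(...)` is true, then push everything from there on.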

Luxonis-Brandon commented 3 years ago

Thanks. Yes I think we should be able to provide an example. Will sync with @VanDavv on Monday.

VanDavv commented 3 years ago

That's a great idea @YijinLiu, thanks! I'll migrate RTSP streaming to Gen2 and see if I can also use VideoEncoder as you suggested.

Captain299792458 commented 2 years ago

If I want to just display on the screen, what could the output sink be?
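
For local display, the usual GStreamer answer (an assumption; the thread itself does not confirm it) is to end the pipeline in a decode-and-display chain, letting `autovideosink` pick a platform-appropriate sink. A sketch of such a pipeline description:

```python
# Sketch: to display the H.264 stream locally instead of serving RTSP,
# replace the RTP payloader with a decode + display chain. autovideosink
# selects a suitable sink for the platform (ximagesink, glimagesink, ...).
DISPLAY_PIPELINE = (
    "appsrc name=source is-live=true format=GST_FORMAT_TIME "
    "caps=video/x-h264,stream-format=byte-stream,alignment=au "
    "! h264parse ! avdec_h264 ! videoconvert ! autovideosink"
)
print(DISPLAY_PIPELINE)
```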