BreeeZe / rpos

Raspberry Pi Onvif Server
http://breeeze.github.io/rpos
MIT License

Present mjpeg stream instead of h264 #56


ghost commented 5 years ago

I have a project where I would like to present the video as MJPEG over HTTP. I know the ONVIF standard covers how to do this, but it's a lot of information to go through. Perhaps someone here could point out some directions or fill in some additional info?

RogerHardiman commented 5 years ago

The starting point is GetProfiles and GetProfile. Look at the XML that these return.

They return a Video Profile, which ties together three other data structures:

- VideoSourceConfiguration (e.g. the camera and the camera resolution)
- VideoEncoderConfiguration (e.g. H264 or MJPEG)
- PTZConfiguration (PTZ related settings)

So you will need to adapt the VideoEncoderConfiguration to describe some MJPEG capabilities.
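For reference, here is a rough sketch of what the encoder part of a GetProfile reply could look like once it advertises MJPEG. Element names follow the ONVIF ver10 schema, but the token and numbers are invented for illustration:

```xml
<!-- Illustrative only: an MJPEG VideoEncoderConfiguration as it might
     appear inside a GetProfile response. Token and values are invented. -->
<trt:VideoEncoderConfiguration token="encoder_config_token">
  <tt:Name>MJPEG Encoder</tt:Name>
  <tt:UseCount>1</tt:UseCount>
  <!-- "JPEG" is what ONVIF calls MJPEG; rpos currently returns "H264" here -->
  <tt:Encoding>JPEG</tt:Encoding>
  <tt:Resolution>
    <tt:Width>1280</tt:Width>
    <tt:Height>720</tt:Height>
  </tt:Resolution>
  <tt:Quality>1</tt:Quality>
  <tt:RateControl>
    <tt:FrameRateLimit>25</tt:FrameRateLimit>
    <tt:EncodingInterval>1</tt:EncodingInterval>
    <tt:BitrateLimit>8192</tt:BitrateLimit>
  </tt:RateControl>
</trt:VideoEncoderConfiguration>
```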

Then you will need to sort out the RTSP server so it streams MJPEG video.
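MJPEG over RTP uses the standard JPEG payloader (RFC 2435, static payload type 26). For a quick sanity check outside of rpos, something like this GStreamer pipeline works; the device, caps and target address are just examples:

```sh
# Hedged sketch, not the rpos implementation: send MJPEG frames over RTP
# using the standard JPEG payloader. Assumes the V4L2 driver is set to MJPEG.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! image/jpeg,width=1280,height=720,framerate=25/1 \
  ! rtpjpegpay pt=26 \
  ! udpsink host=192.168.1.10 port=5004
```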

RogerHardiman commented 5 years ago

It would be really nice if we could implement H264 and MJPEG together. The problem currently is that we use the v4l2 driver for the Raspberry Pi cameras, and that driver can only deliver H264 or MJPEG (not both at once). It would be nice if we could get raw YUV from the camera and then feed it into an H264 encoder and an MJPEG encoder and stream both of those. It looks like that may be possible by building on work from @Schwaneberg to use GStreamer as the video pipeline. I believe there are some GStreamer plugins that access the Pi hardware H264 encoding, so performance with HD could be maintained. This could also get us a JPEG Snapshot (another ONVIF command), which is currently implemented via an ffmpeg hack.
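A rough sketch of that dual-encode idea as a GStreamer pipeline, assuming the Pi's stateful V4L2 H264 encoder is available; element names and caps are assumptions, untested here:

```sh
# Hedged sketch: capture raw YUV once, tee it into a hardware H264 encoder
# and a software JPEG encoder. fakesink stands in for the RTSP machinery.
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! video/x-raw,width=1280,height=720,framerate=25/1 \
  ! tee name=t \
  t. ! queue ! v4l2h264enc ! video/x-h264 ! fakesink \
  t. ! queue ! jpegenc ! fakesink
```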

But a starting point is probably a flag in rposConfig.json for H264 or MJPEG and then running one codec or the other.

ghost commented 5 years ago

Thanks, I will look into it.


Schwaneberg commented 5 years ago

Hello! Yes, it is possible to encode the raw stream with hardware accelerated encoders in GStreamer, but I do not recommend this. I tried it once and the quality is lower compared to the rpicamsrc encoder, and the delay is much higher (from ~180 ms to over 500 ms). So I recommend disabling the v4l2 interface and using native rpicamsrc instead. Then we can switch between MJPEG, RAW and H264 just by rebuilding the GStreamer pipeline.
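To illustrate the pipeline-rebuild idea: with rpicamsrc, switching codecs is just a different caps filter on the same source element. The caps values below are examples only:

```sh
# Hedged sketch of switching codecs by rebuilding the pipeline: the same
# rpicamsrc element, only the caps filter changes. Values are examples.
gst-launch-1.0 rpicamsrc bitrate=4000000 \
  ! video/x-h264,width=1280,height=720,framerate=25/1 ! fakesink   # H264
gst-launch-1.0 rpicamsrc \
  ! image/jpeg,width=1280,height=720,framerate=25/1 ! fakesink     # MJPEG
```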

I agree that introducing a field for the codec would be a good starting point. Let's call it "Codec" with supported values "raw", "mjpeg" and "h264". OK?
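Something like this in rposConfig.json, perhaps. Note that "Codec" is the key proposed here, not an existing option, and the comments are for clarity only (plain JSON does not allow them):

```jsonc
{
  // Proposed, not yet implemented: selects which GStreamer pipeline to build.
  "Codec": "mjpeg"  // one of "raw", "mjpeg", "h264"
}
```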
