BreeeZe / rpos

Raspberry Pi Onvif Server
http://breeeze.github.io/rpos
MIT License

USB camera support with GStreamer RTSP Server #79

Status: Closed · johnnyxwan closed this 4 years ago

johnnyxwan commented 4 years ago

Added a GStreamer pipeline to support USB cameras with an MJPEG profile (which is common):

USB camera ----(MJPEG)----> omxmjpegdec ----(RAW)----> omxh264enc ----(H.264)----> RTSP Server

Only a limited set of parameters and camera settings is ported over, but it is a usable implementation.
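A chain of this shape can be smoke-tested standalone with gst-launch-1.0 before wiring it into rpos. A minimal sketch, assuming a camera on /dev/video0 and illustrative caps and bitrate (not the PR's exact values):

```
# Decode MJPEG from a USB camera and re-encode to H.264 with the Pi's
# hardware codecs; fakesink is only there to verify the pipeline negotiates.
gst-launch-1.0 -v v4l2src device=/dev/video0 \
  ! image/jpeg,width=1280,height=720,framerate=30/1 \
  ! omxmjpegdec \
  ! omxh264enc target-bitrate=1000000 control-rate=variable \
  ! video/x-h264,profile=baseline \
  ! h264parse \
  ! fakesink
```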

RogerHardiman commented 4 years ago

Hi. This is a great new feature. Thanks for the PR. I will try and take a look at the weekend so we can get this into the main project

RogerHardiman commented 4 years ago

Just a thought for @johnnyxwan. Could you use alternative GStreamer plugins so this would work on other devices, e.g. a generic MJPEG decoder and a generic H.264 encoder? I do like the way you have used the Pi's hardware acceleration, which is fab. Just thinking about options for other Linux boxes, or even over on my Mac or PC.
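For what it's worth, a portable software-only variant of the same chain might swap the OMX elements for the generic jpegdec and x264enc; a sketch under that assumption, untested against this PR:

```
# Same MJPEG -> H.264 chain with generic software elements, so it is
# not tied to the Pi's OMX hardware codecs (x264enc bitrate is in kbit/s).
gst-launch-1.0 v4l2src device=/dev/video0 \
  ! image/jpeg,width=1280,height=720,framerate=30/1 \
  ! jpegdec \
  ! videoconvert \
  ! x264enc tune=zerolatency bitrate=1000 \
  ! video/x-h264,profile=baseline \
  ! h264parse \
  ! fakesink
```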

johnnyxwan commented 4 years ago

@cmon69 sorry for the late reply. Not sure of the cause, but your log shows ENOMEM, which might be a hint that you are out of RAM. Maybe run some commands to check the RAM usage?
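For example, with standard Linux tools (nothing rpos-specific):

```
# Overall RAM and swap usage
free -h
# The most memory-hungry processes
ps aux --sort=-%mem | head
```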

johnnyxwan commented 4 years ago

@RogerHardiman thanks haha. Just trying to DIY a cheap little IP cam. An alternative pipeline could surely be used, but the code would need refactoring to pass those configurations through. Moreover, the code is a bit messy right now, since a lot of the existing configuration options are not available for USB cameras. The whole rpos framework may need to change in order to put everything together nicely. Not sure if it's worth the effort.

cmon69 commented 4 years ago

> @cmon69 sorry for the late reply. Not sure of the cause, but your log shows ENOMEM, which might be a hint that you are out of RAM. Maybe run some commands to check the RAM usage?

Thank you for the reply. I have my Raspberry Pi B's memory split set to 128, and am running Buster with the desktop. I know the USB camera works because I can use it with VLC on the Pi. I will look more closely at the memory and try a headless setup without the desktop installed. It might be a while before I get to it, because my son and I are learning Rust so that we can work better with GStreamer. We are trying to develop something that can send a Raspberry Pi's desktop or a webpage to my NVR. A question I have is: why does the camera have to be enabled in the settings, when that only refers to a camera using the serial interface, and not a USB camera?
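(The 128 here presumably refers to the GPU memory split, gpu_mem in /boot/config.txt, which the OMX hardware codecs allocate from. It can be checked with:)

```
# Show how much RAM is assigned to the GPU vs the ARM cores
vcgencmd get_mem gpu
vcgencmd get_mem arm
```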

johnnyxwan commented 4 years ago

> A question I have is: why does the camera have to be enabled in the settings, when that only refers to a camera using the serial interface, and not a USB camera?

Check out my updated README. If you use a USB camera, it should be fine with the camera setting disabled (by setting I mean the one in raspi-config). Try it out!
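(For context, and as an assumption about why this works: raspi-config's camera option only enables the CSI ribbon camera, roughly by toggling lines like these in /boot/config.txt, while a USB camera simply shows up as a V4L2 device such as /dev/video0 either way.)

```
# /boot/config.txt lines managed by raspi-config's camera option
# (affects only the CSI camera, not USB/V4L2 devices)
start_x=1
gpu_mem=128
```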

RogerHardiman commented 4 years ago

Sorry this has taken so long to get to.

Thanks for the contribution.

RogerHardiman commented 4 years ago

Hi @johnnyxwan. Today is the first time I've got USB Camera Mode working on my Pi (it never worked before, but the PR looked good, so it was worth pulling).

With your knowledge of GStreamer (or maybe v4l2-ctl), can we work out what the output format of 'v4l2src' will be in the Python code?

My USB camera does not send JPEG frames (or H.264). It is an old Omnivision OV511-chipset camera that runs at 640x480 resolution, and v4l2src outputs a raw YUV image. So I had to change my pipeline to:

```
v4l2src device=/dev/video1 brightness=50 contrast=0 saturation=0 ! omxh264enc target-bitrate=1000000 control-rate=variable ! video/x-h264,profile=baseline ! h264parse ! rtph264pay name=pay0 pt=96
```

(so I have dropped the JPEG decode)

So I need an easy way to test what the output format of v4l2src will be, and then I can decide whether I need the JPEG decode step.

Any suggestions or pointers? I could spawn and parse v4l2-ctl, but I wondered if there was an easier way.
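One possibility (a sketch, not code from rpos; the API calls are from GStreamer's Python bindings) is to query the cameras' caps directly from Python via the device monitor, instead of shelling out:

```python
# Sketch: list the formats each V4L2 source offers, via GStreamer's
# device monitor, so a pipeline could branch on image/jpeg vs video/x-raw.
import gi
gi.require_version('Gst', '1.0')
from gi.repository import Gst

Gst.init(None)

monitor = Gst.DeviceMonitor.new()
monitor.add_filter('Video/Source', None)
monitor.start()

for device in monitor.get_devices():
    print(device.get_display_name())
    caps = device.get_caps()
    for i in range(caps.get_size()):
        structure = caps.get_structure(i)
        # 'image/jpeg' means MJPEG (needs a decode step);
        # 'video/x-raw' means the camera already hands over raw frames
        print('  ', structure.get_name(), structure.to_string())

monitor.stop()
```

Whether that is easier than parsing the output of `v4l2-ctl --list-formats-ext` is debatable, but it keeps everything inside the existing GStreamer process.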

Thanks

Also @cmon69 , did you get it working?