clementperon opened this issue 9 years ago
This is sort of available statically via https://github.com/ford-prefect/openwebrtc/commits/compressed
Here's a rough task breakdown to make all this work dynamically as well.
- singledecodebin … and merge -- https://bugzilla.gnome.org/show_bug.cgi?id=743511
- interapp* (https://bugzilla.gnome.org/show_bug.cgi?id=743510), but this would probably just end up being a subset of what we'd do for #240.
- Some compressed sources, such as rpicamsrc, support force key unit. If this is not yet proxied between the interapp* or whatever, I guess that needs to be supported. Otherwise you're basically forced to use periodic intra refresh (see the sketch after this list).
- After #240 is landed, we just need to deal with the conversion elements and proper signalling and configuration of profile, level etc. where appropriate.
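To illustrate the force-key-unit point above, here is a minimal sketch (not taken from any of the patches in this thread) of how an application can ask a compressed source such as rpicamsrc for a new keyframe, using the standard upstream force-key-unit event from gst-plugins-base. The helper name and the element passed in are hypothetical; the point is that whatever ends up sitting between the source and the transport bin has to let this upstream event through.

```c
#include <gst/gst.h>
#include <gst/video/video.h>   /* link against gstreamer-video-1.0 */

/* Hypothetical helper: ask the element producing H264 (e.g. rpicamsrc or an
 * encoder) for a new key unit. */
static gboolean
request_key_frame (GstElement *compressed_src)
{
  /* all_headers=TRUE also asks for SPS/PPS to be resent with the keyframe */
  GstEvent *event = gst_video_event_new_upstream_force_key_unit (
      GST_CLOCK_TIME_NONE, TRUE, 0);

  /* gst_element_send_event() takes ownership of the event */
  return gst_element_send_event (compressed_src, event);
}
```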
Hi,
We need to take an H264 compressed video source as input to the OpenWebRTC stack.
In the following branch, which commits do I have to use for this functionality? https://github.com/ford-prefect/openwebrtc/commits/compressed
Does this require changes in GStreamer as well?
Thanks.
Hi @ford-prefect,
We tried applying the gst patch for interappsrc and interappsink, and the compilation went fine.
But when we run "gst-inspect-1.0 interappsrc", we get the following error:
(gst-plugin-scanner:24569): GStreamer-WARNING **: Failed to load plugin '/opt/openwebrtc-0.3/lib/gstreamer-1.0/libgstinter.so': /opt/openwebrtc-0.3/lib/gstreamer-1.0/libgstinter.so: undefined symbol: gst_inter_app_src_get_type
No such element or plugin 'interappsrc'
Do you have any suggestions on this? Thanks.
Hi @suganthikarthick, it looks like your build is missing something, but I'm not sure.
That said, you probably want to redo this branch based on @sdroege's work from #240. You can likely reuse quite a bit of what I did.
Did gstinterappsrc.c get correctly compiled and linked into your plugin?
Just to clarify, the work from #240 should supersede the need for interapp altogether.
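For reference, a rough sketch of what the plugin's registration code presumably looks like, assuming the patch follows the usual GStreamer plugin layout (the exact names come from the patch and may differ). If gstinterappsrc.c, which provides gst_inter_app_src_get_type(), is not compiled and linked into libgstinter.so, the plugin scanner fails with exactly the undefined-symbol error quoted above.

```c
#include <gst/gst.h>

/* Normally declared in gstinterappsrc.h / gstinterappsink.h; the symbol the
 * scanner complains about comes from the G_DEFINE_TYPE in gstinterappsrc.c. */
GType gst_inter_app_src_get_type (void);
GType gst_inter_app_sink_get_type (void);

static gboolean
plugin_init (GstPlugin *plugin)
{
  return gst_element_register (plugin, "interappsrc", GST_RANK_NONE,
             gst_inter_app_src_get_type ()) &&
         gst_element_register (plugin, "interappsink", GST_RANK_NONE,
             gst_inter_app_sink_get_type ());
}

/* The real plugin also needs a GST_PLUGIN_DEFINE() block to export its
 * descriptor; omitted here for brevity. */
```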
Hi @ford-prefect,
Now we have applied the patches for both gst-plugins-bad and the OpenWebRTC stack, and compiled and installed the libraries.
When running the application, we get the following error:
0:00:10.393169524 12380 0xdcc290 ERROR owrtransportagent owr_transport_agent.c:776: handle_new_send_source: Failed to link "(null)" with transport bin
We tried using both custom-gst and a local video source. The video is an H264 compressed input. It would be of great help if somebody has suggestions on this issue.
Thank you.
Thanks.
Unfortunately I don't have the bandwidth to look at this right now. That said, I'd first go over each of my patches to validate that they are still applicable on the current OWR. The new owr inter elements should make my interapp work obsolete, and you might need to modify my patches a bit to work with master. The basic approach should be no different from what I did, it's mostly a matter of making sure the new inter elements are hooked up correctly to deal with compressed formats.
Thank you. We will look into it.
Also, will we be able to use custom-gst in the OpenWebRTC application for this compressed video input?
Thanks.
Hi, I downloaded the openwebrtc compressed branch, compiled it, and tested it with the test-send-receive application. My camera supports H264 compressed output, but I am still not able to see any video when running the test-send-receive application.
Which corresponding version of GStreamer do I have to use, or do you have any comments on this? The GStreamer version that we have is 1.5.
Hi @ford-prefect,
We tested with Opus-encoded audio and that is working fine, but in the case of H264 video we are still facing some issues.
Below are the GStreamer debug messages that I got, along with the error information.
I have applied the patch.
0:00:26.259651822 12360 0xddfa00 DEBUG GST_CAPS gstpad.c:2124:gst_pad_link_check_compatible_unlocked:<bin3:src> src caps video/x-h264, stream-format=(string)avc, alignment=(string)au, profile=(string)baseline
0:00:26.259871706 12360 0xddfa00 DEBUG GST_CAPS gstpad.c:2126:gst_pad_link_check_compatible_unlocked:
0:00:26.260045824 12360 0xddfa00 DEBUG GST_CAPS gstpad.c:2144:gst_pad_link_check_compatible_unlocked: caps are not compatible
0:00:26.260735113 12360 0xddfa00 ERROR owrtransportagent owr_transport_agent.c:780:handle_new_send_source: Failed to link "(null)" with transport bin
Your suggestions will be valuable for us.
Thanks.
I don't see video/x-h264 in the second line. What are those the caps of? Are you able to use your camera for h.264 using a simple gst-launch pipeline? Like gst-launch-1.0 <yourcamerasrc> ! h264parse ! decodebin ! autovideosink
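For completeness, the same camera test written against the GStreamer API instead of gst-launch. This is only a sketch: v4l2src and /dev/video0 are assumptions about the camera, so substitute whatever source element and device you actually use.

```c
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  GError *error = NULL;
  /* The caps filter forces the camera's H.264 output; adjust the source. */
  GstElement *pipeline = gst_parse_launch (
      "v4l2src device=/dev/video0 ! video/x-h264 "
      "! h264parse ! decodebin ! autovideosink", &error);
  if (pipeline == NULL) {
    g_printerr ("Could not build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Block until an error or end-of-stream */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);
  if (msg != NULL)
    gst_message_unref (msg);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (bus);
  gst_object_unref (pipeline);
  return 0;
}
```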
Yes, the camera is able to give x-h264. The src caps are also fine, which are the ones coming from the camera, but the sink caps do not have x-h264.
The following are the lines from the log, and I am able to see x-h264 here.
0:00:14.913169664 25465 0xcf8a00 DEBUG basetransform gstbasetransform.c:746:gst_base_transform_query_caps:<send-input-video-flip-2:sink> peer caps video/x-h264, profile=(string)constrained-baseline; video/x-h264; video/x-raw, format=(string)I420, width=(int)[ 1, 32767 ], height=(int)[ 1, 32767 ], framerate=(fraction)[ 0/1, 2147483647/1 ]; video/x-raw, width=(int)[ 1, 32767 ], height=(int)[ 1, 32767 ], framerate=(fraction)[ 0/1, 2147483647/1 ], format=(string){ I420, YV12, YUY2, UYVY, AYUV, RGBx, BGRx, xRGB, xBGR, RGBA, BGRA, ARGB, ABGR, RGB, BGR, Y41B, Y42B, YVYU, Y444, v210, v216, NV12, NV21, NV16, NV61, NV24, GRAY8, GRAY16_BE, GRAY16_LE, v308, RGB16, BGR16, RGB15, BGR15, UYVP, A420, RGB8P, YUV9, YVU9, IYU1, ARGB64, AYUV64, r210, I420_10LE, I420_10BE, I422_10LE, I422_10BE, Y444_10LE, Y444_10BE, GBR, GBR_10LE, GBR_10BE, NV12_64Z32, A420_10LE, A420_10BE, A422_10LE, A422_10BE, A444_10LE, A444_10BE }
0:00:14.913220244 25465 0x129b050 TRACE GST_LOCKING gstminiobject.c:248:gst_mini_object_unlock: unlock 0x7fc6c4004ef0: state 00010101, access_mode 1
But in the line below, the x-h264 is getting removed.
0:00:14.913240161 25465 0xcf8a00 DEBUG basetransform gstbasetransform.c:749:gst_base_transform_query_caps:<send-input-video-flip-2:sink> our template video/x-raw, format=(string){ AYUV, ARGB, BGRA, ABGR, RGBA, Y444, xRGB, RGBx, xBGR, BGRx, RGB, BGR, I420, YV12, IYUV, YUY2, UYVY, YVYU, NV12, NV21, GRAY8, GRAY16_BE, GRAY16_LE }, width=(int)[ 1, 2147483647 ], height=(int)[ 1, 2147483647 ], framerate=(fraction)[ 0/1, 2147483647/1 ]
After that I am not able to see the x-h264 in the sink.
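That matches what basetransform does when answering a caps query: roughly speaking, it combines the peer caps with its own sink pad template, and since the videoflip element's template only lists video/x-raw, every compressed structure such as video/x-h264 is dropped. A small standalone illustration of that intersection (not OpenWebRTC code, caps strings shortened):

```c
#include <gst/gst.h>

int
main (int argc, char **argv)
{
  gst_init (&argc, &argv);

  /* Shortened versions of the camera caps and a raw-only sink template */
  GstCaps *peer = gst_caps_from_string (
      "video/x-h264, profile=(string)constrained-baseline; "
      "video/x-raw, format=(string)I420");
  GstCaps *templ = gst_caps_from_string ("video/x-raw");

  GstCaps *result = gst_caps_intersect (peer, templ);
  gchar *str = gst_caps_to_string (result);
  g_print ("%s\n", str);  /* only the video/x-raw structure survives */

  g_free (str);
  gst_caps_unref (peer);
  gst_caps_unref (templ);
  gst_caps_unref (result);
  return 0;
}
```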
This isn't the right forum for support requests/questions. Can you please use the owr google group for this? Thanks.
I would actually prefer use of individual issues on GitHub for each separate support issue instead of using the Google Group.
OpenWebRTC should be able to use a compressed source (H264/VP8 for the video source and PCMA/PCMU/Opus for the audio source) and try to negotiate that codec as the default, to avoid re-encoding/resampling.
There is already a small discussion about this in #172.