Open Larryyuan2015 opened 2 months ago
This is a problem that comes from networking in the Android emulator. You need to forward the UDP packets from your PC to the Android emulator; try this repository: https://github.com/fengjiongmax/fltgst-portforward
Alternatively, you can debug on a physical device.
@fengjiongmax I used a physical device to test the app, but still no video image. The adb logcat output is below:
2024-04-26 11:24:05.803 1086-1911 ActivityManager pid-1086 I START u0 {act=android.intent.action.MAIN cat=[android.intent.category.LAUNCHER] flg=0x10200000 cmp=com.example.fltgst/.MainActivity bnds=[704,386][960,586] (has extras)} from uid 10047 on display 0
2024-04-26 11:24:07.990 2731-3116 libEGL com.example.fltgst E validate_display:99 error 3008 (EGL_BAD_DISPLAY)
2024-04-26 11:24:08.957 2731-2731 PlatformViewsController com.example.fltgst I Using hybrid composition for platform view: 0
2024-04-26 11:24:09.003 1086-1151 ActivityManager pid-1086 I Displayed com.example.fltgst/.MainActivity: +3s187ms
There are a few variables in the process that have to be right for this to work.
Hi FJMax,
Playing the video in another terminal on the local Ubuntu system works fine:
gst-launch-1.0 udpsrc port=5000 ! application/x-rtp, payload=96 ! rtph264depay ! h264parse ! queue max-size-buffers=1 ! avdec_h264 output-corrupt=false ! videoconvert ! autovideosink sync=false
but there is no video image on the Android physical device.
Read data files from: /usr/bin/../share/nmap
Nmap done: 1 IP address (1 host up) scanned in 15.95 seconds
Raw packets sent: 1091 (31.576KB) | Rcvd: 3171 (302.952KB)
4. I tried to open port 5000 as below:
sudo sysctl net.ipv4.ip_forward=1
sudo iptables -A INPUT -p udp --dport 5000 -j ACCEPT
but it made no difference; the nmap command gives the same result.
Does the app have network permission? https://developer.android.com/develop/connectivity/network-ops/connecting
I checked that the app has the network permissions:
android.permission.INTERNET
android.permission.ACCESS_NETWORK_STATE
@fengjiongmax Could you try those two pipelines to test the udpsrc plug-in? Does it play images from videotestsrc on an Android physical device? Many thanks!
I also tried the rtspsrc plugin, but the result seems to be the same.
I used the same pipeline on the same physical device to play RTSP video over WiFi. The video source should not be the problem, because another Android app can play it normally using the same pipeline, but there is no image when using it in the fltgst app. The fltgst app compiles without problems. The log is as follows:
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I Rejecting re-init on previously-failed class java.lang.Class<io.flutter.embedding.android.FlutterActivity$1>: java.lang.NoClassDefFoundError: Failed resolution of: Landroid/window/OnBackInvokedCallback;
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at java.lang.Object java.lang.Class.newInstance!() (Class.java:-2)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at android.app.Activity android.app.Instrumentation.newActivity(java.lang.ClassLoader, java.lang.String, android.content.Intent) (Instrumentation.java:1079)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at android.app.Activity android.app.ActivityThread.performLaunchActivity(android.app.ActivityThread$ActivityClientRecord, android.content.Intent) (ActivityThread.java:2557)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.app.ActivityThread.handleLaunchActivity(android.app.ActivityThread$ActivityClientRecord, android.content.Intent, java.lang.String) (ActivityThread.java:2726)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.app.ActivityThread.-wrap12(android.app.ActivityThread, android.app.ActivityThread$ActivityClientRecord, android.content.Intent, java.lang.String) (ActivityThread.java:-1)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.app.ActivityThread$H.handleMessage(android.os.Message) (ActivityThread.java:1477)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.os.Handler.dispatchMessage(android.os.Message) (Handler.java:102)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.os.Looper.loop() (Looper.java:154)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.app.ActivityThread.main(java.lang.String[]) (ActivityThread.java:6119)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at java.lang.Object java.lang.reflect.Method.invoke!(java.lang.Object, java.lang.Object[]) (Method.java:-2)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run() (ZygoteInit.java:886)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void com.android.internal.os.ZygoteInit.main(java.lang.String[]) (ZygoteInit.java:776)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I Caused by: java.lang.ClassNotFoundException: Didn't find class "android.window.OnBackInvokedCallback" on path: DexPathList[[zip file "/data/app/com.example.fltgst-2/base.apk"],nativeLibraryDirectories=[/data/app/com.example.fltgst-2/lib/arm64, /data/app/com.example.fltgst-2/base.apk!/lib/arm64-v8a, /system/lib64, /vendor/lib64]]
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at java.lang.Class dalvik.system.BaseDexClassLoader.findClass(java.lang.String) (BaseDexClassLoader.java:56)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at java.lang.Class java.lang.ClassLoader.loadClass(java.lang.String, boolean) (ClassLoader.java:380)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at java.lang.Class java.lang.ClassLoader.loadClass(java.lang.String) (ClassLoader.java:312)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at java.lang.Object java.lang.Class.newInstance!() (Class.java:-2)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at android.app.Activity android.app.Instrumentation.newActivity(java.lang.ClassLoader, java.lang.String, android.content.Intent) (Instrumentation.java:1079)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at android.app.Activity android.app.ActivityThread.performLaunchActivity(android.app.ActivityThread$ActivityClientRecord, android.content.Intent) (ActivityThread.java:2557)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.app.ActivityThread.handleLaunchActivity(android.app.ActivityThread$ActivityClientRecord, android.content.Intent, java.lang.String) (ActivityThread.java:2726)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.app.ActivityThread.-wrap12(android.app.ActivityThread, android.app.ActivityThread$ActivityClientRecord, android.content.Intent, java.lang.String) (ActivityThread.java:-1)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.app.ActivityThread$H.handleMessage(android.os.Message) (ActivityThread.java:1477)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.os.Handler.dispatchMessage(android.os.Message) (Handler.java:102)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.os.Looper.loop() (Looper.java:154)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void android.app.ActivityThread.main(java.lang.String[]) (ActivityThread.java:6119)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at java.lang.Object java.lang.reflect.Method.invoke!(java.lang.Object, java.lang.Object[]) (Method.java:-2)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run() (ZygoteInit.java:886)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I at void com.android.internal.os.ZygoteInit.main(java.lang.String[]) (ZygoteInit.java:776)
2024-04-29 09:51:29.876 4376-4376 art com.example.fltgst I
2024-04-29 09:51:29.898 4376-4376 art com.example.fltgst I Rejecting re-init on previously-failed class java.lang.Class<io.flutter.embedding.engine.FlutterJNI$$ExternalSyntheticLambda0>: java.lang.NoClassDefFoundError: Failed resolution of: Landroid/graphics/ImageDecoder$OnHeaderDecodedListener;
2024-04-29 09:51:29.898 4376-4376 art com.example.fltgst I at io.flutter.embedding.engine.FlutterJNI io.flutter.embedding.engine.FlutterJNI$Factory.provideFlutterJNI() (FlutterJNI.java:127)
2024-04-29 09:51:29.898 4376-4376 art com.example.fltgst I at void io.flutter.FlutterInjector$Builder.fillDefaults() (FlutterInjector.java:169)
2024-04-29 09:51:29.898 4376-4376 art com.example.fltgst I at io.flutter.FlutterInjector io.flutter.FlutterInjector$Builder.build() (FlutterInjector.java:179)
2024-04-29 09:51:29.898 4376-4376 art com.example.fltgst I at io.flutter.FlutterInjector io.flutter.FlutterInjector.instance() (FlutterInjector.java:57)
2024-04-29 09:51:29.898 4376-4376 art com.example.fltgst I at void io.flutter.embedding.engine.FlutterEngineGroup.
2024-04-29 09:51:29.928 4376-4393 ResourceExtractor com.example.fltgst I Found extracted resources res_timestamp-1-1714334502979
2024-04-29 09:51:29.939 4376-4376 System com.example.fltgst W ClassLoader referenced unknown path:
2024-04-29 09:51:29.941 4376-4376 ApplicationLoaders com.example.fltgst D ignored Vulkan layer search path /data/app/com.example.fltgst-2/lib/arm64:/data/app/com.example.fltgst-2/base.apk!/lib/arm64-v8a for namespace 0x7fa13650f0
2024-04-29 09:51:29.965 4376-4376 Adreno com.example.fltgst I QUALCOMM build : 7f9221e, I45b30eba69
Build Date : 01/24/17
OpenGL ES Shader Compiler Version: XE031.09.00.04
Local Branch :
Remote Branch :
Remote Branch :
Reconstruct Branch :
2024-04-29 09:51:30.145 4376-4404 OpenGLRenderer com.example.fltgst I Initialized EGL, version 1.4
2024-04-29 09:51:30.145 4376-4404 OpenGLRenderer com.example.fltgst D Swap behavior 1
2024-04-29 09:51:30.373 4376-4409 flutter com.example.fltgst I The Dart VM service is listening on http://127.0.0.1:58505/eaEC7Oa0tw0=/
2024-04-29 09:51:32.753 4376-4395 libEGL com.example.fltgst E validate_display:99 error 3008 (EGL_BAD_DISPLAY)
2024-04-29 09:51:33.728 4376-4376 PlatformViewsController com.example.fltgst I Using hybrid composition for platform view: 0
2024-04-29 09:51:33.859 1063-1155 ActivityManager pid-1063 I Displayed com.example.fltgst/.MainActivity: +4s103ms
Did you update your CMakeLists.txt? Your pipeline is different from the one in the previous issue, and with different elements the required libraries are different too.
You can check whether libraries are missing by creating the elements with gst_element_factory_make and then null-checking the result.
I updated the CMakeLists.txt as follows:
LIST(APPEND GST_PLUGINS coreelements coretracers adder app audioconvert audiorate audiotestsrc videotestsrc videorate videofilter videoconvertscale videoparsersbad udp gio autodetect opensles ipcpipeline opengl playback rtp rawparse androidmedia libav isomp4 rtsp)
LIST(APPEND LINK_LIBS intl ffi iconv gmodule-2.0 pcre2-8 gstbase-1.0 gstaudio-1.0 gstvideo-1.0 gstgl-1.0 gstcontroller-1.0 png16 graphene-1.0 jpeg orc-0.4 gstapp-1.0 gio-2.0 android log z OpenSLES EGL GLESv2 avutil avcodec avformat gstpbutils-1.0 avfilter swresample bz2 gsttag-1.0 gstcodecparsers-1.0 gstnet-1.0 gstrtp-1.0 gstphotography-1.0 gstrtsp gstrtsp-1.0 gstrtspclientsink gstrtspserver-1.0 gstsdp-1.0 gstx264 gstisomp4 gstriff-1.0)
and I tried it on another physical device and debugged the code as follows:
FFI_PLUGIN_EXPORT void setup_pipeline(void)
{
    GError *error = NULL;
    // Setup pipeline
    gchar *pipeline_description = g_strdup("rtspsrc location=rtsp://192.168.1.1/live/main_stream latency=0 ! rtph265depay ! h265parse ! avdec_h265 ! videoconvert ! autovideosink sync=false");
    //gchar *pipeline_description = g_strdup("udpsrc port=5000 ! application/x-rtp, payload=96 ! rtph264depay ! h264parse ! queue max-size-buffers=1 ! avdec_h264 output-corrupt=false ! videoconvert ! autovideosink sync=false");
    data->pipeline = gst_parse_launch(pipeline_description, &error);
    g_free(pipeline_description);
    if (error != NULL) {
        g_printerr("Could not construct pipeline: %s\n", error->message);
        g_clear_error(&error);
    }
    if (!data->pipeline) {
        g_printerr("Pipeline could not be created.\n");
        return;
    }
    // Set pipeline state
    gst_element_set_state(data->pipeline, GST_STATE_READY);
    // Get overlay interface
    data->overlay = gst_bin_get_by_interface(GST_BIN(data->pipeline), GST_TYPE_VIDEO_OVERLAY);
}
The pipeline was created with no error, but there is still no image. Some logs below:
2024-04-29 10:06:21.931 3677-22996 Quality-OIDTUtils com.oplus.persist.system I Exception type:2; pkgName:com.example.fltgst;
2024-04-29 10:06:22.134 5000-5000 PlatformViewsController com.example.fltgst I Using hybrid composition for platform view: 0
2024-04-29 10:06:22.136 1163-1163 OplusLayer surfaceflinger D setBuffer sequence=115769, name=SurfaceView[com.example.fltgst/com.example.fltgst.MainActivity](BLAST)#0
2024-04-29 10:06:22.136 5000-8687 SurfaceComposerClient com.example.fltgst D VRR [FRTC] client handle [bufferId:18446744073709551615 framenumber:0] [ffffffff, ffffffff]
2024-04-29 10:06:22.142 5000-5000 Quality com.example.fltgst I Skipped: false 1 cost 18.973171 refreshRate 0 processName com.example.fltgst
2024-04-29 10:06:22.146 1951-2008 OplusDisplayPolicy system_server D com.example.fltgst, no change cutoutMode: 0
2024-04-29 10:06:22.178 5000-5000 ViewRootIm...nActivity] com.example.fltgst D debugCancelDraw cancelDraw=false,count = 63,android.view.ViewRootImpl@dbea269
2024-04-29 10:06:22.183 5000-5000 BLASTBufferQueue com.example.fltgst E BLASTBufferItemConsumer::onDisconnect()
2024-04-29 10:06:22.190 1163-1163 OplusLayer surfaceflinger D setBuffer sequence=115764, name=com.example.fltgst/com.example.fltgst.MainActivity#0
2024-04-29 10:06:22.190 5000-8687 SurfaceComposerClient com.example.fltgst D VRR [FRTC] client handle [bufferId:18446744073709551615 framenumber:0] [ffffffff, ffffffff]
https://github.com/fengjiongmax/fltgst/issues/4#issuecomment-2032360916
Please use gst_element_factory_make to create the elements. In your case, when debugging the application, it's better to use gst_element_factory_make so you can tell whether something is missing.
Replace GST_PLUGINS with the following:
LIST(APPEND GST_PLUGINS coreelements coretracers adder app audioconvert audiorate audiotestsrc videotestsrc videorate videofilter videoconvertscale videoparsersbad udp gio autodetect opensles ipcpipeline opengl playback rtp rawparse androidmedia libav rtpmanagerbad)
I changed the source to an RTSP stream with H.264, and I used the following pipeline in another app (not Flutter), which also plays normally on the same physical device: pipeline = "rtspsrc location=rtsp://192.168.1.1/live/main_stream latency=0 ! rtph264depay ! h264parse ! avdec_h264 ! videoconvert ! autovideosink sync=false";
I switched to gst_element_factory_make to create the elements, as in the code below. I debugged it and couldn't find the problem, but there is still no image.
/* source new src pad created */
static void RtspSrcPadAdded_callback(GstElement *src, GstPad *new_pad, gpointer user_data)
{
    FltGstData *data = (FltGstData *)user_data;
    GstPad *sink_pad = gst_element_get_static_pad(data->rtpdepay, "sink");
    GstCaps *p_caps;
    gchar *description;
    GstPadLinkReturn ret;
    GstCaps *new_pad_caps = NULL;
    GstStructure *new_pad_struct = NULL;
    const gchar *new_pad_type = NULL;

    g_print("Received new pad '%s' from '%s':\n", GST_PAD_NAME(new_pad), GST_ELEMENT_NAME(src));
    if (gst_pad_is_linked(sink_pad))
    {
        g_print("We are already linked. Ignoring.\n");
        goto exit;
    }

    // here, you would set up a new pad link for the newly created pad
    // so, now find that rtph264depay is needed and link them?
    p_caps = gst_pad_get_pad_template_caps(new_pad);
    description = gst_caps_to_string(p_caps);
    g_print("new pad caps: %s\n", description);
    g_free(description);
    if (NULL != p_caps)
        gst_caps_unref(p_caps);

    /* Attempt the link */
    /* Check the new pad's type */
    new_pad_caps = gst_pad_get_current_caps(new_pad);
    new_pad_struct = gst_caps_get_structure(new_pad_caps, 0);
    new_pad_type = gst_structure_get_name(new_pad_struct);
    if (!g_str_has_prefix(new_pad_type, "application/x-rtp"))
    {
        g_print("It has type '%s' which is not application/x-rtp. Ignoring.\n", new_pad_type);
        goto exit;
    }

    ret = gst_pad_link(new_pad, sink_pad); // link
    if (GST_PAD_LINK_FAILED(ret))
    {
        g_print("Type is '%s' but link failed.\n", new_pad_type);
    }
    else
    {
        g_print("Link succeeded (type '%s').\n", new_pad_type);
    }
    if (NULL != new_pad_caps)
        gst_caps_unref(new_pad_caps);
exit:
    if (sink_pad != NULL)
        gst_object_unref(sink_pad);
}
FFI_PLUGIN_EXPORT void setup_pipeline(void)
{
    // Setup pipeline
    data->source = gst_element_factory_make("rtspsrc", "source");
    data->rtpdepay = gst_element_factory_make("rtph264depay", NULL);
    data->rtp_queue = gst_element_factory_make("queue", "rtp_queue");
    data->h264parse = gst_element_factory_make("h264parse", NULL);
    data->h264_queue = gst_element_factory_make("queue", "h264_queue");
    data->decoder = gst_element_factory_make("avdec_h264", NULL);
    data->videoconvert = gst_element_factory_make("videoconvert", NULL);
    data->video_queue = gst_element_factory_make("queue", "video_queue");
    data->sink = gst_element_factory_make("autovideosink", "sink"); // autovideosink, fakesink, appsink, filesink
    // Check if elements are created successfully
    if (!data->source || !data->rtpdepay || !data->rtp_queue || !data->h264parse || !data->h264_queue || !data->decoder || !data->videoconvert || !data->video_queue || !data->sink) {
        g_printerr("Failed to create elements. Exiting.\n");
        // Clean up
        if (data->source) gst_object_unref(data->source);
        if (data->rtpdepay) gst_object_unref(data->rtpdepay);
        if (data->rtp_queue) gst_object_unref(data->rtp_queue);
        if (data->h264parse) gst_object_unref(data->h264parse);
        if (data->h264_queue) gst_object_unref(data->h264_queue);
        if (data->decoder) gst_object_unref(data->decoder);
        if (data->videoconvert) gst_object_unref(data->videoconvert);
        if (data->video_queue) gst_object_unref(data->video_queue);
        if (data->sink) gst_object_unref(data->sink);
        return;
    }
    // Set the RTSP location
    g_object_set(G_OBJECT(data->source), "location", "rtsp://192.168.1.1/live/main_stream", "latency", 200, NULL);
    // Create the pipeline
    data->pipeline = gst_pipeline_new("test-pipeline");
    if (!data->pipeline) {
        g_printerr("Pipeline could not be created.\n");
        return;
    }
    g_object_set(G_OBJECT(data->sink),
                 "sync", FALSE,
                 "emit-signals", TRUE,
                 "caps", gst_caps_new_simple("video/x-raw",
                                             "format", G_TYPE_STRING, "NV12", NULL),
                 NULL);
    // Add elements to the pipeline
    gst_bin_add_many(GST_BIN(data->pipeline), data->source, data->rtpdepay, data->rtp_queue, data->h264parse, data->h264_queue, data->decoder, data->videoconvert, data->video_queue, data->sink, NULL);
    g_signal_connect(data->source, "pad-added", G_CALLBACK(RtspSrcPadAdded_callback), data);
    if (!gst_element_link_many(data->rtpdepay, data->rtp_queue, data->h264parse, data->h264_queue, data->decoder, data->videoconvert, data->video_queue, NULL))
    {
        g_printerr("@@@ OpenRtsp: Failed to link rtpdepay -> video_queue\n");
        return;
    }
    if (!gst_element_link_many(data->video_queue, data->sink, NULL))
    {
        g_printerr("@@@ OpenRtsp: Failed to link video_queue -> sink\n");
        return;
    }
    g_print("All elements Link success\n");
    // Set pipeline state
    gst_element_set_state(data->pipeline, GST_STATE_READY);
    // Get overlay interface
    data->overlay = gst_bin_get_by_interface(GST_BIN(data->pipeline), GST_TYPE_VIDEO_OVERLAY);
}
Please do your own research.
I used tcpdump to capture the RTP data, which is all received normally from the RTSP server on the physical device. I viewed the pcap file in Wireshark and compared it with a dump captured while using the Android GStreamer app; they are almost the same.
14 0.274021 192.168.1.93 192.168.1.1 RTSP 229 PLAY rtsp://192.168.1.1/live/main_stream RTSP/1.0
20 0.297753 192.168.1.1 192.168.1.93 RTSP 164 Reply: RTSP/1.0 200 OK
69 0.855230 192.168.1.1 192.168.1.93 H264 79 PT=DynamicRTP-Type-96, SSRC=0x22345678, Seq=5558, Time=7603031, Mark SPS
70 0.855259 192.168.1.1 192.168.1.93 H264 58 PT=DynamicRTP-Type-96, SSRC=0x22345678, Seq=5559, Time=7603031, Mark PPS
71 0.855541 192.168.1.1 192.168.1.93 H264 1498 PT=DynamicRTP-Type-96, SSRC=0x22345678, Seq=5560, Time=7603031 FU-A Start:IDR-Slice
72 0.855861 192.168.1.1 192.168.1.93 H264 1498 PT=DynamicRTP-Type-96, SSRC=0x22345678, Seq=5561, Time=7603031 FU-A
....
I think the RTSP network transport layer is fine, but I still have no idea how to locate the problem further. Do you have any experience successfully using the rtspsrc plug-in in Flutter?
I think the pipeline setup in the Flutter app should be fine, based on single-step tracing in Android Studio. In addition, using the same method as in Flutter (gst_element_factory_make for all elements), I implemented an RTSP client on PC Ubuntu; it plays the same RTSP source normally over WiFi.
Please provide the following information:
native_binding.c
app/native_binding/src/CMakeLists.txt
And please use markdown code block syntax.
@fengjiongmax I attached the files, please help to check them. Thank you so much!
// Set the RTSP location
g_object_set(G_OBJECT(data->source), "location", "rtsp://192.168.1.1/live/main_stream", "latency", 0, NULL);
g_object_set (G_OBJECT (data->sink), "sync", FALSE, NULL);
// Add elements to the pipeline
gst_bin_add_many(GST_BIN(data->pipeline), data->source, data->rtpdepay, data->h264parse, data->decoder, data->video_queue, data->videoconvert, data->sink, NULL);
// Link confirmation
if (!gst_element_link_many (data->rtpdepay, data->h264parse, data->decoder, NULL)){
g_warning ("Linking part (A)-1 Fail...");
return;
}
// Link confirmation
if (!gst_element_link_many (data->video_queue, data->videoconvert, data->sink, NULL)){
g_warning ("Linking part (A)-2 Fail...");
return;
}
// Dynamic Pad Creation
if(! g_signal_connect (data->source, "pad-added", G_CALLBACK (on_pad_added),data->rtpdepay))
{
g_warning ("Linking part (1) with part (A)-1 Fail...");
}
// Dynamic Pad Creation
if(! g_signal_connect (data->decoder, "pad-added", G_CALLBACK (on_pad_added),data->video_queue))
{
g_warning ("Linking part (2) with part (A)-2 Fail...");
}
You did not link the source data->source to the data->rtpdepay element.
Hi FJMax,
My pad-link method refers to this code: https://github.com/enthusiasticgeek/gstreamer-rtsp-ssl-example. I modified the rtsp_client.c file. It compiles and runs on the Ubuntu system (./rtsp_client rtsp://192.168.1.1/live/main_stream) and the RTSP video stream plays.
I traced it under Android Studio debugging:
if(! g_signal_connect (data->source, "pad-added", G_CALLBACK (on_pad_added),data->rtpdepay))
{
g_warning ("Linking part (1) with part (A)-1 Fail...");
}
Because the g_warning line of code was never reached, I assumed the link was fine at that time.
But when I set a breakpoint in the on_pad_added function, it was never entered. How did you determine that data->source was not linked to the data->rtpdepay element?
I missed that, not sure why the repo was designed this way.
And for callbacks, you need a GMainLoop running: https://gstreamer.freedesktop.org/documentation/application-development/basics/bus.html#bus
Can you try a simpler pipeline? I think you need more knowledge of GStreamer, and more time reading the documentation.
If you want to keep using this pipeline, you can add a function to start the GMainLoop: https://github.com/fengjiongmax/fltgst/blob/99ebdc1b811bc5ed9ff36063d4e45d0c8a77d6dc/app/native_binding/src/native_binding.c#L68-L79 You can reference this commit: 99ebdc1b811bc5ed9ff36063d4e45d0c8a77d6dc , or watch the second video of the playlist.
Hi FJMax, @fengjiongmax I tested the udpsrc plugin with the fltgst project but failed. For the server I used this on a PC terminal: gst-launch-1.0 -v videotestsrc ! "video/x-raw,framerate=30/1" ! x264enc key-int-max=30 ! rtph264pay ! udpsink host=127.0.0.1 port=5000
It plays with the pipeline below on another PC terminal: gst-launch-1.0 udpsrc port=5000 ! application/x-rtp,payload=96 ! rtph264depay ! decodebin ! autovideosink sync=false
Then I used the same pipeline in an Android app based on the fltgst project.
There was no error building the app, and no error setting up the pipeline as below when debugging it, but the app shows no video image. Could you help me out? Thank you very much!
FFI_PLUGIN_EXPORT void setup_pipeline(void)
{
    GError *error = NULL;
    // Setup pipeline
    gchar *pipeline_description = g_strdup("udpsrc port=5000 ! application/x-rtp, payload=96 ! rtph264depay ! decodebin ! autovideosink sync=false");
}