justinjoy / gst-android-camera

GNU Lesser General Public License v2.1

Sending camera frames with UDPsink #2

Closed the-information-guy closed 6 years ago

the-information-guy commented 6 years ago

Hey, just wondering if you could help me.

Trying to send the camera frames to an IP via UDP. I tried replacing the glimagesink with a udpsink, but then I no longer get a camera preview, and the packets I do receive are not displaying (see the attached screenshot).

GstElement* udpsink = gst_element_factory_make ("udpsink", NULL);
g_object_set (udpsink, "host", "192.168.43.3", "port", 5200, "sync", FALSE, NULL);

ahc->vsink = udpsink;

ahc->pipeline = gst_pipeline_new("camera-pipeline");

gst_bin_add_many(GST_BIN (ahc->pipeline), ahc->ahcsrc, ahc->filter, enc, pay, rnd, ahc->vsink, NULL);
gst_element_link_many(ahc->ahcsrc, ahc->filter, enc, pay, rnd, ahc->vsink, NULL);

Any idea how I could manage to do this? :)

Thanks

justinjoy commented 6 years ago

Your pipeline here seems to be okay, but which caps did you use for udpsrc on the client side? And do you think pay and rnd work correctly? This usually happens because of mismatched caps between udpsink and udpsrc.

the-information-guy commented 6 years ago

Hmm, here are all my elements:

GstElement* enc = gst_element_factory_make ("jpegenc", NULL);
GstElement* pay = gst_element_factory_make ("rtpjpegpay", NULL);
GstElement* rnd = gst_element_factory_make ("rndbuffersize", NULL);
g_object_set (rnd, "max", 1316, "min", 1, NULL);

I'm using the following to read the stream:

gst-launch-1.0 -v udpsrc port=5200 ! videoparse ! decodebin ! autovideosink sync=false

Any idea what the error could be? Or a pipeline that you think will work?

justinjoy commented 6 years ago

You're using the wrong pipeline to read the stream. On the client side, you should use rtpjpegdepay to depacketize the stream. I haven't tested it, but the pipeline below might work.

  udpsrc port=5200 ! rtpjpegdepay ! decodebin ! autovideosink

the-information-guy commented 6 years ago

Getting the following error:

No RTP format was negotiated.

Also, about the missing preview with udpsink, I found the following error:

invalid cast from 'GstUDPSink' to 'GstVideoOverlay' gst_video_overlay_set_window_handle: assertion 'GST_IS_VIDEO_OVERLAY (overlay)' failed

justinjoy commented 6 years ago

Okay, there are a couple of problems here.

I apologize: the pipeline I mentioned was a pseudo combination. To make it work, you also need to set the caps manually on udpsrc.

In your case, use the fixed caps of the rtpjpegpay src pad on Android. I guess it would be application/x-rtp,media=(string)video,clock-rate=(int)90000,payload=(int)96,encoding-name=(string)JPEG,width=(int)320,height=(int)240, but you must check the exact caps in your application.

Then set those caps on udpsrc:

  udpsrc port=xxx caps='xxx' ! ... 
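For example, the receiving pipeline could then look like this (a sketch, untested; the caps value shown is only the guess from above and must be replaced with the exact caps your application reports):

```shell
# The caps must exactly match rtpjpegpay's src pad caps on the sender.
gst-launch-1.0 -v udpsrc port=5200 \
    caps='application/x-rtp,media=(string)video,clock-rate=(int)90000,payload=(int)96,encoding-name=(string)JPEG' \
  ! rtpjpegdepay ! jpegdec ! autovideosink sync=false
```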

Also about the no-preview with UDPsink, found the following error :

Yes, it won't show a preview, because udpsink is a packet sender over the network, not a renderer for a display. If you want to send packets and see the preview simultaneously, you should use a tee element, then attach glimagesink and udpsink to its branches.
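A sender sketch of that tee layout (untested; the encoder branch reuses the elements from earlier in this thread, and host/port are placeholders):

```shell
# One tee branch renders the preview, the other encodes and sends RTP/JPEG.
gst-launch-1.0 ahcsrc ! tee name=t \
    t. ! queue ! glimagesink \
    t. ! queue ! jpegenc ! rtpjpegpay ! udpsink host=192.168.43.3 port=5200 sync=false
```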

the-information-guy commented 6 years ago

Yep, I'm setting the caps, but I still hit the same error:

gstrtpbasedepayload.c(492): gst_rtp_base_depayload_handle_buffer (): /GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0:
Received invalid RTP payload, dropping

Do you have more ideas? :)

Maybe I should not go with JPEG encoding, but H.264 with x264enc and rtph264pay does not send packets either ...


Yes, it won't show preview because udpsink is a packet sender over the network, not a renderer for a device. If you want to send packets and see preview simultaneously, you should use tee element, then attach glimagesink and udpsink properly.

Yep, I already tried to set it up using a tee and two queues, but it's hard to get working. Gonna check it out again.

By the way, thanks for the great help sir :)

justinjoy commented 6 years ago

Do you have more ideas? :)

If I read it literally, the payload is mismatched. However, without seeing the logs I am not sure. Have you checked the payload value on your Android side?
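One way to check what rtpjpegpay actually negotiates (a sketch on a desktop machine, using videotestsrc as a stand-in for the Android camera source): run the pipeline with -v, which prints the caps of every pad, including the payload value on rtpjpegpay's src pad.

```shell
# Look in the -v output for application/x-rtp caps with payload=(int)...
gst-launch-1.0 -v videotestsrc num-buffers=10 ! jpegenc ! rtpjpegpay ! fakesink
```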

Maybe I should not go with jpeg encoding but h264 for example does not send packets using x264enc and rtph264pay ...

I think your problem is mismatched caps between pay and depay rather than the encoding type, but in general H.264 is more common for video streaming over the network. :)

By the way, thanks for the great help sir :)

You're welcome.

justinjoy commented 6 years ago

@the-information-guy I am wondering if you can play remotely.

the-information-guy commented 6 years ago

Got the tee working properly so I have a preview + udpsink sending packets.

Still, when trying to decode, I'm facing the Received invalid RTP payload, dropping error :/

Do you know how I could check the payload value? Are there special parameters to set in the client-side pipeline, like sprop-parameter-sets, to get past this error?

I am wondering if you can play remotely.

What do you actually mean by that? So far, I can send the camera frames from a PC (server) to a phone (client) and check it on a surface view but cannot get it working the other way.

justinjoy commented 6 years ago

What do you actually mean by that?

I meant you could play the stream from the phone over UDP.

So far, I can send the camera frames from a PC (server) to a phone (client) and check it on a surface view but cannot get it working the other way.

Confusing. Sending camera frames from PC to phone? I think your question is about how to send camera frames from udpsink on Android to udpsrc on a PC.

Anyway, it would be helpful if you shared GST_DEBUG=5 logs of both pipelines (Android and PC). Without analyzing the logs, everything in this discussion is guesswork.
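On the PC side, capturing that log could look like the following (a sketch; the receiver pipeline is the one from earlier in this thread, and the caps placeholder must be filled in with the actual caps):

```shell
# GStreamer debug output goes to stderr; redirect it to a file.
GST_DEBUG=5 gst-launch-1.0 udpsrc port=5200 caps='...' \
  ! rtpjpegdepay ! jpegdec ! autovideosink 2> receiver.log
```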

the-information-guy commented 6 years ago

Seems like the issue was with rndbuffersize. Deleted it, and now it works smoothly :+1:

Thanks for the help :)

guidoschmidt commented 3 years ago

@justinjoy @the-information-guy I'm also trying to pipe the camera stream to a udpsink and receive it on another computer over the network. How did you solve it in the end? Here's what I've got so far:

static void *
app_function (void *userdata)
{
  JavaVMAttachArgs args;
  GstBus *bus;
  GstMessage *msg;
  GstAhc *ahc = (GstAhc *) userdata;
  GSource *bus_source;
  GMainContext *context;

  GST_DEBUG ("Creating pipeline in GstAhc at %p", ahc);

  /* create our own GLib Main Context, so we do not interfere with other libraries using GLib */
  context = g_main_context_new ();

  // SETUP: Android camera source
  ahc->ahcsrc = gst_element_factory_make ("ahcsrc", "ahcsrc");
  g_object_set (ahc->ahcsrc, "device", "0", NULL);
  // SETUP: caps filter
  ahc->filter = gst_element_factory_make ("capsfilter", NULL);
  GstCaps* new_caps = gst_caps_new_simple ("video/x-raw",
                                  "width", G_TYPE_INT, 640,
                                  "height", G_TYPE_INT, 480,
                                  NULL);
  g_object_set (ahc->filter,"caps", new_caps, NULL);
  gst_caps_unref (new_caps);
  // SETUP: video sink
  ahc->vsink = gst_element_factory_make ("glimagesink", "vsink");
  // SETUP: udp sink
  GstElement* udp_sink = gst_element_factory_make("udpsink", NULL);
  // SETUP: Queues
  ahc->video_queue = gst_element_factory_make("queue", "video_queue");
  ahc->udp_queue = gst_element_factory_make("queue", "udp_queue");
  // SETUP: tee
  ahc->tee = gst_element_factory_make("tee", "tee");
  // SETUP: h264 encoding
  GstElement* enc = gst_element_factory_make("x264enc", NULL);
  // SETUP: h264 payload
  GstElement* pay = gst_element_factory_make("rtph264pay", NULL);
  // SETUP: videoscale
  GstElement* videoscale = gst_element_factory_make("videoscale", NULL);
  // SETUP: videotestsrc
  GstElement* videotestsrc = gst_element_factory_make("videotestsrc", NULL);
  //ahc->ahcsrc = videotestsrc;
  // SETUP: videoscale
  GstElement* videoconvert = gst_element_factory_make("videoconvert", NULL);
  // SETUP: pipeline
  ahc->pipeline = gst_pipeline_new ("camera-pipeline");

  // Test pipeline:
  // gst-launch-1.0 -v videotestsrc ! video/x-raw,framerate=20/1 ! videoscale ! videoconvert ! x264enc tune=zerolatency bitrate=500 speed-preset=superfast ! rtph264pay ! udpsink host=127.0.0.1 port=5000

  __android_log_print(ANDROID_LOG_DEBUG, "ahc", "\n ahc->pipeline %d \n", !(ahc->pipeline));
  __android_log_print(ANDROID_LOG_DEBUG, "ahc","ahc->ahcsrc %d", !(ahc->ahcsrc));
  __android_log_print(ANDROID_LOG_DEBUG, "ahc","ahc->vsink %d", !(ahc->vsink));
  __android_log_print(ANDROID_LOG_DEBUG, "ahc","ahc->filter %d", !(ahc->filter));
  __android_log_print(ANDROID_LOG_DEBUG, "ahc","ahc->udpsink %d", !(udp_sink));
  __android_log_print(ANDROID_LOG_DEBUG, "ahc","ahc->video_queue %d", !(ahc->video_queue));
  __android_log_print(ANDROID_LOG_DEBUG, "ahc","ahc->udp_queue %d", !(ahc->udp_queue));
  __android_log_print(ANDROID_LOG_DEBUG, "ahc","ahc->tee %d", !(ahc->tee));

  if (!ahc->pipeline || !ahc->ahcsrc || !ahc->vsink || !ahc->filter || !udp_sink || !ahc->video_queue || !ahc->udp_queue || !ahc->tee) {
    g_printerr ("Not all elements could be created.\n");
    return NULL;
  }

  g_object_set (udp_sink, "host", "192.168.2.149", "port", 5000, "sync", FALSE, NULL);

  gst_bin_add_many (GST_BIN (ahc->pipeline),
    ahc->ahcsrc,
    ahc->tee,
    ahc->video_queue,
    ahc->filter,
    ahc->vsink,

    ahc->udp_queue,
    videoscale,
    videoconvert,
    enc,
    pay,
    udp_sink,
    NULL);

  gst_element_link_many (
          ahc->ahcsrc,
          ahc->tee,
          ahc->video_queue,
          ahc->filter,
          ahc->vsink,

          ahc->udp_queue,
          enc,
          pay,
          udp_sink,
          NULL);

  if (ahc->native_window) {
    GST_DEBUG ("Native window already received, notifying the vsink about it.");
    gst_video_overlay_set_window_handle (GST_VIDEO_OVERLAY (ahc->vsink), (guintptr) ahc->native_window);
  }

  /* Instruct the bus to emit signals for each received message, and connect to the interesting signals */
  bus = gst_element_get_bus (ahc->pipeline);
  bus_source = gst_bus_create_watch (bus);
  g_source_set_callback (bus_source, (GSourceFunc) gst_bus_async_signal_func,NULL, NULL);
  g_source_attach (bus_source, context);
  g_source_unref (bus_source);
  g_signal_connect (G_OBJECT (bus), "message::error", G_CALLBACK (on_error), ahc);
  g_signal_connect (G_OBJECT (bus), "message::eos", (GCallback) eos_cb, ahc);
  g_signal_connect (G_OBJECT (bus), "message::state-changed",
      (GCallback) state_changed_cb, ahc);
  gst_object_unref (bus);

  /* Create a GLib Main Loop and set it to run */
  GST_DEBUG ("Entering main loop... (GstAhc:%p)", ahc);
  ahc->main_loop = g_main_loop_new (context, FALSE);
  check_initialization_complete (ahc);
  g_main_loop_run (ahc->main_loop);
  GST_DEBUG ("Exited main loop");
  g_main_loop_unref (ahc->main_loop);
  ahc->main_loop = NULL;

  /* Free resources */
  g_main_context_unref (context);
  gst_element_set_state (ahc->pipeline, GST_STATE_NULL);
  gst_object_unref (ahc->vsink);
  gst_object_unref (ahc->filter);
  gst_object_unref (ahc->ahcsrc);
  gst_object_unref (ahc->pipeline);

  return NULL;
}

Unfortunately I do not get anything on the udpsrc receiver 🤔, though I didn't see any warnings or errors in logcat.

justinjoy commented 3 years ago

Unfortunately I do not get anything on the udpsrc receiver 🤔, though I didn't see any warnings or errors in logcat.

In that case, you should check the sender pipeline first. If you are sure the pipeline works, then I think it could be a network environment problem.
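One thing worth double-checking in the snippet above: a single gst_element_link_many chain cannot continue past a sink element such as glimagesink, so the udpsink branch never gets linked there, and the unchecked return value hides that. A sketch of linking the two tee branches separately, reusing the element names from the snippet (untested):

```c
/* Link each tee branch on its own: gst_element_link () requests a new
 * src pad from the tee for every downstream link, and a chain cannot
 * continue past a sink element like glimagesink. */
if (!gst_element_link_many (ahc->ahcsrc, ahc->tee, NULL) ||
    !gst_element_link_many (ahc->tee, ahc->video_queue, ahc->filter,
        ahc->vsink, NULL) ||
    !gst_element_link_many (ahc->tee, ahc->udp_queue, videoconvert,
        videoscale, enc, pay, udp_sink, NULL)) {
  g_printerr ("Failed to link one of the tee branches.\n");
}
```

Checking the return values this way makes a linking failure show up in the log instead of silently producing a pipeline with a dead branch.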