FDH2 / UxPlay

AirPlay Unix mirroring server
GNU General Public License v3.0

Stream video output #162

Closed iw-an closed 1 year ago

iw-an commented 1 year ago

How would it be possible to expose the RTSP stream so that I can, for example, use VLC to play the stream? It seems like it should be possible by just giving it the correct local address.

fduncanh commented 1 year ago

Are you asking whether the video and audio streamed from the iOS device can be sent to VLC (either to avoid GStreamer, or by using it)?

AirPlay streams video and audio separately, as two encrypted streams using the RAOP protocol, with timestamps that allow them to be synchronized. These are decrypted separately and fed into two parallel GStreamer pipelines, one for video and one for audio. RAOP is slightly modified RTSP, I believe. The streams pass from the RAOP code in /lib through callbacks to uxplay.cpp and are sent to GStreamer in /renderers, so they could be intercepted in uxplay.cpp:

extern "C" void audio_process (void *cls, raop_ntp_t *ntp, audio_decode_struct *data) {
    if (dump_audio) {
        dump_audio_to_file(data->data, data->data_len, (data->data)[0] & 0xf0);
    }
    if (use_audio) {
        audio_renderer_render_buffer(ntp, data->data, data->data_len, data->ntp_time, data->rtp_time, data->seqnum);
    }
}

extern "C" void video_process (void *cls, raop_ntp_t *ntp, h264_decode_struct *data) {
    if (dump_video) {
        dump_video_to_file(data->data, data->data_len);
    }
    if (use_video) {
        video_renderer_render_buffer(ntp, data->data, data->data_len, data->pts, data->nal_count);
    }
}

In principle, a modified GStreamer pipeline could be built to recombine them into a standard MP4 stream that could be retransmitted, or sent to VLC, instead of rendering them separately, but uxplay is not set up to do that.

I expect it would be a major GStreamer pipeline construction task. The video pipeline is fully user-configurable with uxplay options, so the video (only) could possibly be sent to VLC using the -vs option with some suitable GStreamer videosink, without too much trouble, if you are a GStreamer guru!

iw-an commented 1 year ago

As an update, we got this working using the following pipeline:

webrtcsink name=video_sink appsrc block=true do-timestamp=true name=video_source format=3 stream-type=0 is-live=true ! queue ! decodebin ! videoconvert ! video_sink.video_0

We also had to comment out GST_BUFFER_DTS(buffer) = (GstClockTime)pts; in video_renderer_render_buffer.

The scope of our requirements changed slightly, so we are using WebRTC; however, the same approach should work with dashsink.

fduncanh commented 1 year ago

Sounds interesting!