pedroSG94 / RTSP-Server

Plugin of rtmp-rtsp-stream-client-java to stream directly to an RTSP player.
Apache License 2.0

Livestreaming with raw ByteArray #126

Closed · snwowolf20170103 closed this 3 months ago

snwowolf20170103 commented 4 months ago

Hi @pedroSG94,

Is there a way to livestream using the latest library with a raw byte array as the data source from another app via shared memory?

pedroSG94 commented 4 months ago

Hello,

Yes, you can stream a byte array if that array is in H264 or H265. This is an example for H264: https://github.com/pedroSG94/RootEncoder/issues/1033#issuecomment-1008749330 `isIdr` indicates whether the byte array is a keyframe. If you don't know whether it is, you can check it this way: https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/base/recording/BaseRecordController.java#L94
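As extra context for readers: one common way to make that keyframe check yourself is to read the NAL unit type of the encoded buffer. Below is a minimal sketch of that idea, assuming Annex-B H264 buffers that still contain their start codes; the function name `isH264KeyFrame` is hypothetical and this is not the library's exact code.

    // Minimal sketch (not the library's code): report whether an H264 Annex-B
    // buffer contains an IDR NAL unit, i.e. whether the byte array is a keyframe.
    fun isH264KeyFrame(buffer: ByteArray): Boolean {
        var i = 0
        while (i + 4 < buffer.size) {
            // Annex-B start codes: 00 00 01 (short) or 00 00 00 01 (long)
            val shortCode = buffer[i] == 0.toByte() && buffer[i + 1] == 0.toByte() &&
                    buffer[i + 2] == 1.toByte()
            val longCode = buffer[i] == 0.toByte() && buffer[i + 1] == 0.toByte() &&
                    buffer[i + 2] == 0.toByte() && buffer[i + 3] == 1.toByte()
            if (shortCode || longCode) {
                val headerIndex = i + if (shortCode) 3 else 4
                val nalType = buffer[headerIndex].toInt() and 0x1F
                if (nalType == 5) return true  // 5 = IDR slice; SPS (7) / PPS (8) often precede it
                i = headerIndex
            } else {
                i++
            }
        }
        return false
    }

For H265 the same scan applies, but the NAL type is `(header shr 1) and 0x3F`, and types 19 (IDR_W_RADL) and 20 (IDR_N_LP) mark keyframes.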

snwowolf20170103 commented 3 months ago

Thank you for your response. I implemented the local RTSP service and streaming service like this:

    // TextureView used as the preview surface for both camera objects
    textureView = new TextureView(this);
    textureView.setSurfaceTextureListener(this);
    // CustomGenericCamera1 is my own wrapper around GenericCamera1
    genericCamera1 = new CustomGenericCamera1(textureView, this);

    // Local RTSP server on port 1938, video only
    rtspServerCamera1 = new RtspServerCamera1(textureView, this, 1938);
    rtspServerCamera1.getStreamClient().setOnlyVideo(true);

    // Start feeding simulated NV21 frames to both objects (see below)
    simulateNv21Data();

    // prepare() and prepareServer() are helper methods (not shown here)
    if (prepare()) {
        // Push the stream to the remote RTSP server
        genericCamera1.startStream("rtsp://111.53.214.99:554/iphone/stream");
    } else {
        Log.d(TAG, "onCreate: Error preparing stream, This device can't do it");
    }

    if (prepareServer()) {
        // Start serving the local RTSP stream
        rtspServerCamera1.startStream();
    }

By the way:

1. simulateNv21Data():

        genericCamera1.inputYUVData(nv21);
        rtspServerCamera1.inputYUVData(nv21);

2. inputYUVData():

        fun inputYUVData(nv21: ByteArray) {
            val frame = Frame(nv21, 0, false, ImageFormat.NV21, System.nanoTime() / 1000)
            videoEncoder.inputYUVData(frame)
        }

Is this approach correct?
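For readers following along: `simulateNv21Data()` itself is not shown in the thread. A minimal sketch of what such a generator could look like, assuming a fixed 640x480 resolution that matches what `prepare()`/`prepareServer()` configure, and reusing the `inputYUVData()` helpers shown above; the executor, frame rate and flat gray test pattern are illustrative only.

    import java.util.concurrent.Executors
    import java.util.concurrent.TimeUnit

    // Hypothetical sketch of simulateNv21Data(): push a flat gray NV21 frame
    // to both objects at roughly 30 fps. width/height must match the
    // resolution configured for the video encoders.
    private val frameExecutor = Executors.newSingleThreadScheduledExecutor()

    fun simulateNv21Data(width: Int = 640, height: Int = 480) {
        // NV21 layout: width*height luma bytes followed by width*height/2 interleaved VU bytes
        val nv21 = ByteArray(width * height * 3 / 2) { 0x80.toByte() }
        frameExecutor.scheduleAtFixedRate({
            genericCamera1.inputYUVData(nv21)
            rtspServerCamera1.inputYUVData(nv21)
        }, 0, 33, TimeUnit.MILLISECONDS)  // ~30 frames per second
    }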

pedroSG94 commented 3 months ago

Hello,

Yes, the idea is correct. But why RtspServer and GenericCamera1 together? Normally you only use one of them; remember that on some devices you can only use one instance of the camera and microphone at the same time.

snwowolf20170103 commented 3 months ago

Thank you!!! The reason for using RtspServer and GenericCamera1 together is that they serve different purposes: GenericCamera1 pushes data to another server, while RtspServer turns the current device into an RTSP service with the added functionality of watermarking. The current video delay is about 7 seconds. Do you have any good solutions for this?

pedroSG94 commented 3 months ago

Which player are you using? Normally the delay is on the player side. This library should have less than 1 second of delay, but the server and player also add delay.
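Side note for readers: most of a multi-second delay usually comes from buffering in the player (and any relay server), not from this library. As an illustration only, here is a sketch of a lower-latency playback setup, assuming the viewer uses an androidx.media3 (ExoPlayer) based Android app with the media3-exoplayer-rtsp module on the classpath; the buffer values are illustrative, not tuned recommendations.

    import android.content.Context
    import androidx.annotation.OptIn
    import androidx.media3.common.MediaItem
    import androidx.media3.common.util.UnstableApi
    import androidx.media3.exoplayer.DefaultLoadControl
    import androidx.media3.exoplayer.ExoPlayer
    import androidx.media3.exoplayer.rtsp.RtspMediaSource

    // Sketch: shrink ExoPlayer's default buffers so playback starts sooner and
    // stays closer to live. The numbers below are only a starting point.
    @OptIn(UnstableApi::class)
    fun buildLowLatencyRtspPlayer(context: Context, url: String): ExoPlayer {
        val loadControl = DefaultLoadControl.Builder()
            .setBufferDurationsMs(
                /* minBufferMs = */ 500,
                /* maxBufferMs = */ 1000,
                /* bufferForPlaybackMs = */ 250,
                /* bufferForPlaybackAfterRebufferMs = */ 500
            )
            .build()
        val player = ExoPlayer.Builder(context).setLoadControl(loadControl).build()
        val mediaSource = RtspMediaSource.Factory()
            .setForceUseRtpTcp(true)  // interleave RTP over the RTSP TCP connection
            .createMediaSource(MediaItem.fromUri(url))
        player.setMediaSource(mediaSource)
        player.prepare()
        player.playWhenReady = true
        return player
    }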

snwowolf20170103 commented 3 months ago

Alright, got it. Thank you for your response.
