Hello,
Yes, you can stream a byte array if that array is H264 or H265. This is an example for H264: https://github.com/pedroSG94/RootEncoder/issues/1033#issuecomment-1008749330 The idr flag means that the byte array is a keyframe. If you don't know it, you can check it this way: https://github.com/pedroSG94/RootEncoder/blob/master/library/src/main/java/com/pedro/library/base/recording/BaseRecordController.java#L94
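For reference, a minimal sketch of such a check, assuming Annex-B H264 data (this helper is not part of the library; a keyframe is simply an IDR NAL unit, type 5):
// Sketch only: check whether an Annex-B H264 buffer starts with an IDR (keyframe) NAL unit.
fun isH264KeyFrame(data: ByteArray): Boolean {
    // Skip the Annex-B start code (00 00 01 or 00 00 00 01).
    val offset = when {
        data.size > 4 && data[0] == 0.toByte() && data[1] == 0.toByte() &&
                data[2] == 0.toByte() && data[3] == 1.toByte() -> 4
        data.size > 3 && data[0] == 0.toByte() && data[1] == 0.toByte() &&
                data[2] == 1.toByte() -> 3
        else -> return false
    }
    val nalType = data[offset].toInt() and 0x1F
    return nalType == 5 // 5 = IDR slice (keyframe) in H264
}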
Thank you for your response. I implemented the local RTSP service and streaming service like this:
textureView = new TextureView(this);
textureView.setSurfaceTextureListener(this);
genericCamera1 = new CustomGenericCamera1(textureView, this);
rtspServerCamera1 = new RtspServerCamera1(textureView, this, 1938);
rtspServerCamera1.getStreamClient().setOnlyVideo(true);
simulateNv21Data();
if (prepare()) {
    genericCamera1.startStream("rtsp://111.53.214.99:554/iphone/stream");
} else {
    Log.d(TAG, "onCreate: Error preparing stream, This device can't do it");
}
if (prepareServer()) {
    rtspServerCamera1.startStream();
}
By the way:
1. simulateNv21Data() (a sketch of how the dummy NV21 buffer is built is shown after this listing):
genericCamera1.inputYUVData(nv21);
rtspServerCamera1.inputYUVData(nv21);
2. inputYUVData():
fun inputYUVData(nv21: ByteArray) {
    val frame = Frame(nv21, 0, false, ImageFormat.NV21, System.nanoTime() / 1000)
    videoEncoder.inputYUVData(frame)
}
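As mentioned above, this is roughly how the dummy NV21 buffer for simulateNv21Data() is built (a sketch only, with an assumed 640x480 resolution; buildDummyNv21 is my own helper, not library code):
// Sketch only: NV21 needs width * height bytes of Y plus width * height / 2 bytes
// of interleaved VU data, i.e. width * height * 3 / 2 bytes in total.
fun buildDummyNv21(width: Int = 640, height: Int = 480): ByteArray {
    val ySize = width * height
    val nv21 = ByteArray(ySize * 3 / 2)
    nv21.fill(0x80.toByte(), 0, ySize)          // mid-gray luma
    nv21.fill(0x80.toByte(), ySize, nv21.size)  // neutral chroma
    return nv21
}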
Is this approach correct?
Hello,
Yes, the idea is correct. But why RtspServer and GenericCamera1 together? Normally you only use one of them; remember that on some devices you can only use one instance of the camera and microphone at the same time.
Thank you!!! The reason for using RtspServer and GenericCamera1 together is that they serve different purposes: GenericCamera1 pushes data to another server, while RtspServer turns the current device into an RTSP service with the added functionality of watermarking. The current video delay is about 7 seconds. Do you have any good solutions for this?
Which player are you using? Normally the delay is on the player side. This library should have less than 1 s of delay, but the server and player also add delay.
Alright, got it. Thank you for your response.
Hi @pedroSG94,
Is there a way to livestream using the latest library with a raw byte array as the data source from another app via shared memory?
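To make the question more concrete, here is a minimal sketch of what I have in mind, assuming the other app passes an android.os.SharedMemory handle over Binder/AIDL and writes one NV21 frame into it. receiveSharedFrame is a hypothetical helper and inputYUVData() is the helper from my earlier snippet; neither is part of the library:
import android.os.SharedMemory

// Sketch only: copy one NV21 frame out of a shared memory region received from
// another app and feed it to the encoder.
fun receiveSharedFrame(sharedMemory: SharedMemory, width: Int, height: Int) {
    val frameSize = width * height * 3 / 2      // NV21 frame size
    val buffer = sharedMemory.mapReadOnly()     // ByteBuffer backed by the shared region
    val nv21 = ByteArray(frameSize)
    buffer.get(nv21, 0, frameSize)              // copy the frame out of shared memory
    SharedMemory.unmap(buffer)                  // release the mapping when done
    inputYUVData(nv21)                          // feed it to the encoder as in the code above
}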