pedroSG94 / RootEncoder

RootEncoder for Android (rtmp-rtsp-stream-client-java) is a stream encoder to push video/audio to media servers using the RTMP, RTSP, SRT and UDP protocols, with all code written in Java/Kotlin
Apache License 2.0

RTSP - MISSING NAL - WOWZA Streaming engine #965

Closed biviel closed 9 months ago

biviel commented 3 years ago

Hi, @pedroSG94 , I really appreciate your work on this library, especially the maintenance and support you provide here!

I'm working on an application that uses this library to do RTMP live streaming to a Wowza Streaming Engine. Recently I started switching to RTSP, since it lets me encode in H.265 and the server should support that fine.

In the end, after setting up an RtspClient, I'm facing an issue when I start streaming to the server URL. On the server side I see warnings like: RTPTrack.getCodecConfig(video): Missing NAL SPS(7), RTPTrack.getCodecConfig(video): Missing NAL PPS(8), then RTPTrack.checkRTCPSSRC[RTPStream={streamcontext=livestream/_definst_/myStream,mode=PUBLISH,uuid=925522951},RTPTrack={ourSsrc=1151399308/audio}]: ssrc error: expected:da45a80e, got:da6a30d6 from host:null

and RTPTrack.checkRTCPSSRC[RTPStream={streamcontext=livestream/_definst_/myStream,mode=PUBLISH,uuid=925522951},RTPTrack={ourSsrc=1051487197/video}]: ssrc error: expected:cf3d0aac, got:c51b90a6 from host:null

From the Android client I can see:

I/OpenGlViewBase: Thread started.
I/AudioEncoder: AudioEncoder started
I/MicrophoneManager: Microphone started
I/RtspClient: OPTIONS rtsp://my.domain.com:1935/livestream/myStream RTSP/1.0 CSeq: 1
I/RtspClient: RTSP/1.0 200 OK CSeq: 1 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, OPTIONS, ANNOUNCE, RECORD, GET_PARAMETER Supported: play.basic, con.persistent
ANNOUNCE rtsp://my.domain.com:1935/livestream/myStream RTSP/1.0 CSeq: 2 Content-Length: 448 Content-Type: application/sdp

v=0
o=- 0 0 IN IP4 127.0.0.1
s=Unnamed
i=N/A
c=IN IP4 feeder.flow.tours
t=0 0
a=recvonly
m=video 0 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;sprop-parameter-sets=WjBLQUh0b0hnVVpB,YU00TmlBPT0=;
a=control:trackID=1
m=audio 0 RTP/AVP 96
a=rtpmap:96 MPEG4-GENERIC/32000/2
a=fmtp:96 streamtype=5; profile-level-id=15; mode=AAC-hbr; config=1290; SizeLength=13; IndexLength=3; IndexDeltaLength=3;
a=control:trackID=0

I/RtspClient: RTSP/1.0 200 OK CSeq: 2 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Session: 1428659075;timeout=60
SETUP rtsp://my.domain.com:1935/livestream/myStream/trackID=0 RTSP/1.0 Transport: RTP/AVP/TCP;interleaved=0-1;mode=record CSeq: 3 Session: 1428659075
I/RtspClient: RTSP/1.0 200 OK CSeq: 3 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Expires: Tue, 2 Nov 2021 22:38:04 UTC Transport: RTP/AVP/TCP;interleaved=0-1;mode=record Date: Tue, 2 Nov 2021 22:38:04 UTC Session: 1428659075;timeout=60
SETUP rtsp://my.domain.com:1935/livestream/myStream/trackID=1 RTSP/1.0 Transport: RTP/AVP/TCP;interleaved=2-3;mode=record CSeq: 4 Session: 1428659075
I/RtspClient: RTSP/1.0 200 OK CSeq: 4 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Expires: Tue, 2 Nov 2021 22:38:04 UTC Transport: RTP/AVP/TCP;interleaved=2-3;mode=record Date: Tue, 2 Nov 2021 22:38:04 UTC Session: 1428659075;timeout=60
I/RtspClient: RECORD rtsp://my.domain.com:1935/livestream/myStream RTSP/1.0 Range: npt=0.000- CSeq: 5 Session: 1428659075
I/RtspClient: RTSP/1.0 200 OK CSeq: 5 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Range: npt=now- Session: 1428659075;timeout=60
I/BaseRtpSocket: wrote packet: Audio, size: 381
I/BaseSenderReport: wrote report: Audio, packets: 1, octet: 381
I/MainActivity: at tours.flow.hdrstreaming(null:-1) . [run] --- Rtmp サーバーへ 接続成功しました。
D/ACodec: dataspace changed to 0x10c60000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:6(BT2020), T:3(SMPTE_170M))
I/BaseRtpSocket: wrote packet: Audio, size: 382
I/BaseRtpSocket: wrote packet: Audio, size: 391
I/BaseRtpSocket: wrote packet: Audio, size: 391
I/BaseRtpSocket: wrote packet: Audio, size: 392
I/BaseRtpSocket: wrote packet: Video, size: 38
I/BaseSenderReport: wrote report: Video, packets: 1, octet: 38
I/BaseRtpSocket: wrote packet: Audio, size: 398
I/BaseRtpSocket: wrote packet: Audio, size: 398
I/BaseRtpSocket: wrote packet: Audio, size: 400 wrote packet: Audio, size: 388
I/BaseRtpSocket: wrote packet: Video, size: 37
I/BaseRtpSocket: wrote packet: Video, size: 1272

I can't see video when I try to play it in a player, although playback works fine when I stream to the same server from other Android tools on another device. This is an older Android 7.1 device (SDK API 25) and I'm using an older version of your library because of the device's limitations (it's an Android-based Theta Z1 360-degree camera I'm playing with).

I'm using library version 1.4.8; even upgrading to this version and making it work took some effort because of compatibility issues I was facing...

Thanks for your thoughts and help in advance!

Regards, Laszlo

biviel commented 3 years ago

Hi, I forgot to highlight that this device has no actual screen. I'm trying to use OpenGlView, which worked well with RTMP; there are RtmpExtendBase and RtmpExtend classes using a surface...

pedroSG94 commented 3 years ago

Hello,

First of all, I recommend you compile the app example and test it to see whether the bug is already resolved, because I made a lot of changes since 1.4.8. Even if you have no screen, you can use a screen-mirroring program like Vysor to check that everything works (I think the from-file or display example should work even if you don't have access to the camera API).

For now I can tell you this:

For this reason I think both errors are fixed, so please test it and let me know if your error persists or you get a different one. If you have any error, share a logcat with me.

biviel commented 3 years ago

hi, @pedroSG94 , thanks for your quick answer!

It took a big effort to make version 1.4.8 work. :-) I was merging release by release to find out why the next release wasn't working. I started at version 1.2.9 and that gave me code to start from. So please, could you give me any hint or help to make it work with this version?

Today I managed to make it work and stream audio fine, but the video/screen was black when playing in my player.

I use Vysor and I'm able to see the actual image through it while running, but the video stream is missing. I feel like I'm close to making this cam work... if not in H.265, then in H.264 first...

I attached the RtspExtend.java and RtspExtendBase.java files, which were migrated from the RtmpExtend and RtmpExtendBase classes, so you may notice that I "replaced" srsFlvMuxer with an RtspClient. In the startStream(url) method I execute only rtspclient.setUrl(url);

Please look at these classes: rtspextend.txt rtspextendbase.txt

In the examples I'm not able to find a similar approach; could you please have a look and see if you notice anything that could help me?

So now audio plays fine via my server, but the video screen remains black. This is what I see on Android:

I see this line in the SDP: m=video 0 RTP/AVP 96

This is that portion of the log:

I/OpenGlViewBase: Thread started.
I/AudioEncoder: AudioEncoder started
I/MicrophoneManager: Microphone started
D/ACodec: dataspace changed to 0x10c60000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:6(BT2020), T:3(SMPTE_170M))
I/RtspClient: OPTIONS rtsp://my.domain.tours:1935/livestream/myStream RTSP/1.0 CSeq: 1
I/RtspClient: RTSP/1.0 200 OK CSeq: 1 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, OPTIONS, ANNOUNCE, RECORD, GET_PARAMETER Supported: play.basic, con.persistent
ANNOUNCE rtsp://my.domain.tours:1935/livestream/myStream RTSP/1.0 CSeq: 2 Content-Length: 448 Content-Type: application/sdp

v=0
o=- 0 0 IN IP4 127.0.0.1
s=Unnamed
i=N/A
c=IN IP4 feeder.flow.tours
t=0 0
a=recvonly
m=video 0 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;sprop-parameter-sets=WjBLQUh0b0hnVVpB,YU00TmlBPT0=;
a=control:trackID=1
m=audio 0 RTP/AVP 96
a=rtpmap:96 MPEG4-GENERIC/32000/2
a=fmtp:96 streamtype=5; profile-level-id=15; mode=AAC-hbr; config=1290; SizeLength=13; IndexLength=3; IndexDeltaLength=3;
a=control:trackID=0

I/RtspClient: RTSP/1.0 200 OK CSeq: 2 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Session: 858771865;timeout=60 I/RtspClient: SETUP rtsp://my.domain.tours:1935/livestream/myStream/trackID=0 RTSP/1.0 Transport: RTP/AVP/TCP;interleaved=0-1;mode=record CSeq: 3 Session: 858771865 I/RtspClient: RTSP/1.0 200 OK CSeq: 3 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Expires: Wed, 3 Nov 2021 08:50:14 UTC Transport: RTP/AVP/TCP;interleaved=0-1;mode=record Date: Wed, 3 Nov 2021 08:50:14 UTC Session: 858771865;timeout=60 SETUP rtsp://my.domain.tours:1935/livestream/myStream/trackID=1 RTSP/1.0 Transport: RTP/AVP/TCP;interleaved=2-3;mode=record CSeq: 4 Session: 858771865 I/RtspClient: RTSP/1.0 200 OK CSeq: 4 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Expires: Wed, 3 Nov 2021 08:50:14 UTC Transport: RTP/AVP/TCP;interleaved=2-3;mode=record Date: Wed, 3 Nov 2021 08:50:14 UTC Session: 858771865;timeout=60 RECORD rtsp://my.domain.tours:1935/livestream/myStream RTSP/1.0 Range: npt=0.000- CSeq: 5 Session: 858771865 I/RtspClient: RTSP/1.0 200 OK CSeq: 5 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Range: npt=now- Session: 858771865;timeout=60 I/MainActivity: at tours.flow.hdrstreaming(null:-1) . [run] --- Rtmp サーバーへ 接続成功しました。 I/BaseRtpSocket: wrote packet: Audio, size: 407 I/BaseSenderReport: wrote report: Audio, packets: 1, octet: 407 I/BaseRtpSocket: wrote packet: Audio, size: 411 I/BaseRtpSocket: wrote packet: Audio, size: 386 I/BaseRtpSocket: wrote packet: Audio, size: 390 I/BaseRtpSocket: wrote packet: Video, size: 1272 I/BaseSenderReport: wrote report: Video, packets: 1, octet: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 981 I/BaseRtpSocket: wrote packet: Audio, size: 389 wrote packet: Audio, size: 392 I/BaseRtpSocket: wrote packet: Audio, size: 390 I/BaseRtpSocket: wrote packet: Audio, size: 395 I/BaseRtpSocket: wrote packet: Video, size: 1272

Thanks in advance for your inputs!

Laszlo

pedroSG94 commented 3 years ago

It seems that you are sending video to the server but the keyframe is missing. Try producing keyframes more frequently to see if that is the case. In the prepareVideo method (RtspExtendBase), set iFrameInterval to 2 instead of 20 (one keyframe every 2 seconds, which is a normal value). If the error still persists, set it to 0 (every frame is a keyframe); that last setting is only to confirm whether the problem is a missing keyframe, and you should never leave it at 0 because bandwidth increases a lot.
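
For example, in RtspExtendBase the interval is the argument just before FormatVideoEncoder.SURFACE. A minimal sketch based on the class you attached (the resolution, fps and bitrate values are placeholders, keep your own):

public boolean prepareVideo() {
  // Sketch only: same prepareVideoEncoder call as in your prepareVideo(),
  // with iFrameInterval lowered from 20 to 2 (one keyframe every 2 seconds).
  return videoEncoder.prepareVideoEncoder(
      3840, 2160,                  // width, height (placeholders)
      30,                          // fps (placeholder)
      4000 * 1024,                 // bitrate (placeholder)
      0,                           // rotation
      2,                           // iFrameInterval in seconds (was 20)
      FormatVideoEncoder.SURFACE);
}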

Also, if you downloaded the library manually instead of via Gradle from JitPack, you can try downloading only the rtsp module and using it.

biviel commented 3 years ago

Hi, I tried setting it to 0, but there is no difference; the video is still empty.

About the download: are you suggesting I try downloading only the latest RTSP module and keep the rest as-is at version 1.4.8?

Thanks!

pedroSG94 commented 3 years ago

About the download: are you suggesting I try downloading only the latest RTSP module and keep the rest as-is at version 1.4.8?

Yes, if possible try it and let me know if it works. If not, share a logcat with me.

biviel commented 3 years ago

Hi, I downloaded only the rtsp module from version 2.1.3, moved it in manually, adjusted build.gradle and built. Via Vysor I can see the camera image, but when I start to stream it streams only the audio; the screen is black.

(attached screenshot: Untitled)

In OpenGlViewBase I had to add init(), as visible in the image, inside start(); otherwise it wasn't able to run, the cam crashed and couldn't start. I made that change earlier during the upgrades from library 1.2.9 to 1.4.8, and it was working fine. Could the issue I'm facing be related to the encoder/rtplibrary modules?

Logcat here:

I/OpenGlViewBase: Thread started.
I/AudioEncoder: AudioEncoder started
I/MicrophoneManager: Microphone started
D/ACodec: dataspace changed to 0x10c60000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:6(BT2020), T:3(SMPTE_170M))
I/RtspClient: OPTIONS rtsp://my.domain.com:1935/livestream/myStream RTSP/1.0 CSeq: 1
I/RtspClient: RTSP/1.0 200 OK CSeq: 1 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, OPTIONS, ANNOUNCE, RECORD, GET_PARAMETER Supported: play.basic, con.persistent
I/RtspClient: ANNOUNCE rtsp://my.domain.com:1935/livestream/myStream RTSP/1.0 CSeq: 2 Content-Length: 448 Content-Type: application/sdp

v=0
o=- 0 0 IN IP4 127.0.0.1
s=Unnamed
i=N/A
c=IN IP4 feeder.flow.tours
t=0 0
a=recvonly
m=video 0 RTP/AVP 96
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1;sprop-parameter-sets=WjBLQUh0b0hnVVpB,YU00TmlBPT0=;
a=control:trackID=1
m=audio 0 RTP/AVP 96
a=rtpmap:96 MPEG4-GENERIC/32000/2
a=fmtp:96 streamtype=5; profile-level-id=15; mode=AAC-hbr; config=1290; SizeLength=13; IndexLength=3; IndexDeltaLength=3;
a=control:trackID=0

I/RtspClient: RTSP/1.0 200 OK CSeq: 2 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Session: 724302632;timeout=60 I/RtspClient: SETUP rtsp://my.domain.com:1935/livestream/myStream/trackID=0 RTSP/1.0 Transport: RTP/AVP/TCP;interleaved=0-1;mode=record CSeq: 3 Session: 724302632 I/RtspClient: RTSP/1.0 200 OK CSeq: 3 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Expires: Wed, 3 Nov 2021 10:44:23 UTC Transport: RTP/AVP/TCP;interleaved=0-1;mode=record Date: Wed, 3 Nov 2021 10:44:23 UTC Session: 724302632;timeout=60 SETUP rtsp://my.domain.com:1935/livestream/myStream/trackID=1 RTSP/1.0 Transport: RTP/AVP/TCP;interleaved=2-3;mode=record CSeq: 4 Session: 724302632 I/RtspClient: RTSP/1.0 200 OK CSeq: 4 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Expires: Wed, 3 Nov 2021 10:44:23 UTC Transport: RTP/AVP/TCP;interleaved=2-3;mode=record Date: Wed, 3 Nov 2021 10:44:23 UTC Session: 724302632;timeout=60 RECORD rtsp://my.domain.com:1935/livestream/myStream RTSP/1.0 Range: npt=0.000- CSeq: 5 Session: 724302632 I/RtspClient: RTSP/1.0 200 OK CSeq: 5 Server: Wowza Streaming Engine 4.8.13+1 build20210527172944 Cache-Control: no-cache Range: npt=now- Session: 724302632;timeout=60 I/BaseRtpSocket: wrote packet: Video, size: 37 I/BaseSenderReport: wrote report: Video, packets: 1, octet: 37 I/BaseRtpSocket: wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 wrote packet: Video, size: 1272 I/BaseRtpSocket: wrote packet: Video, size: 1272

pedroSG94 commented 3 years ago

So the new rtsp module doesn't resolve the problem? This is weird, because I can't see this log: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/e80d772e2272e4d32c562c83b13f66d08b777489/rtsp/src/main/java/com/pedro/rtsp/rtsp/RtspClient.kt#L124 That line must be logged for the connection to succeed, and I use streamid instead of trackID in the ANNOUNCE and SETUP commands, so I think you are not actually using the 2.1.3 rtsp module.

I don't think the encoder/rtplibrary modules are related, because RTMP is working fine and the only difference between RTMP and RTSP is the rtmp/rtsp module itself. Also, we should confirm that my library version 2.1.3 with RTSP works fine with Wowza, to rule out that a Wowza update or library update broke it. Try compiling the app example on a normal device and check the OpenGlView RTSP example, which is the closest implementation to your actual case.
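
Roughly, that example boils down to this (a simplified sketch, not the exact app code; the layout id is a placeholder and the Activity is assumed to implement ConnectCheckerRtsp):

import com.pedro.rtplibrary.rtsp.RtspCamera1;
import com.pedro.rtplibrary.view.OpenGlView;
import com.pedro.rtsp.utils.ConnectCheckerRtsp;

// Simplified sketch of the OpenGlView RTSP example flow.
OpenGlView openGlView = (OpenGlView) findViewById(R.id.surfaceView); // placeholder layout id
RtspCamera1 rtspCamera1 = new RtspCamera1(openGlView, this); // "this" implements ConnectCheckerRtsp

if (rtspCamera1.prepareAudio() && rtspCamera1.prepareVideo()) {
  rtspCamera1.startStream("rtsp://my.domain.com:1935/livestream/myStream");
}
// ... later
rtspCamera1.stopStream();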

biviel commented 3 years ago

Hi! You are right. I'm now trying to copy the latest RTSP module and facing some issues, since it is Kotlin-based and Kotlin is not set up in my project. I will come back to you ASAP!

biviel commented 3 years ago

hi, @pedroSG94 , I can confirm that on the mobiles I tested, the library worked fine with my Wowza server... So the issue I'm facing is specific to this mobile/camera.

I'm now running library 1.9.0, but earlier versions were also failing; actually with all of them, when I run the app it just crashes and exits without an error message at OpenGlViewBase.java:

  @Override
  public void start() {
    synchronized (sync) {
      Log.i(TAG, "Thread started.");
      thread = new Thread(this, "glThread");
      running = true;
      thread.start();
      semaphore.acquireUninterruptibly();
    }
  }

In versions prior to 1.4.9 the camera was stopping the same way. I could make it work in earlier versions by adding init() there, before creating the thread, like this:

@Override
  public void start() {
    synchronized (sync) {
      Log.i(TAG, "Thread started.");
      init();
      thread = new Thread(this, "glThread");
      running = true;
      thread.start();
      semaphore.acquireUninterruptibly();
    }
  }

By adding this init() in earlier library versions, RTMP streaming was working fine. Now, without it, this is all that appears in logcat; it stops right after the GL thread is created:

I/VideoEncoder: prepared
I/OpenGlViewBase: Thread started.

BUT if I add init() there, it gets a bit further and I can see the camera image via Vysor for about 0.5 seconds right before it crashes; the logcat looks like this:

I/VideoEncoder: prepared
I/OpenGlViewBase: Thread started.
I/VideoEncoder: started
I/MediaCodec: MediaCodec will operate in async mode
I/AudioEncoder: started
I/MediaCodec: MediaCodec will operate in async mode
I/MicrophoneManager: Microphone started
D/ACodec: dataspace changed to 0x10c60000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:6(BT2020), T:3(SMPTE_170M))
I/MicrophoneManager: Microphone stopped
I/MainActivity: at tours.flow.hdrstreaming(MainActivity.java:733) . [run] --- Rtmpサーバーから切断しました。
I/VideoEncoder: stopped
I/AudioEncoder: stopped

MainActivity.java:733 is where onDisconnectRtsp() is executed...

pedroSG94 commented 3 years ago

I can't see any crash in the logs. If you mean this line, I don't understand the Japanese text and I don't know what line 733 is:

I/MainActivity: at tours.flow.hdrstreaming(MainActivity.java:733) . [run] --- Rtmpサーバーから切断しました。

Anyway, I made a small modification to RtspExtendBase so it works with the latest library version. The other class seems fine. I left you a few TODOs with questions or suggestions about the code. Try using this class and update to 2.1.3:

/**
 * Copyright 2018 Ricoh Company, Ltd.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package com.pedro.rtpstreamer.theta;

import android.content.Context;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Build;

import com.pedro.encoder.Frame;
import com.pedro.encoder.audio.AudioEncoder;
import com.pedro.encoder.audio.GetAacData;
import com.pedro.encoder.input.audio.GetMicrophoneData;
import com.pedro.encoder.input.audio.MicrophoneManager;
import com.pedro.encoder.input.decoder.AudioDecoderInterface;
import com.pedro.encoder.input.video.Camera1ApiManager;
import com.pedro.encoder.input.video.GetCameraData;
import com.pedro.encoder.video.FormatVideoEncoder;
import com.pedro.encoder.video.GetVideoData;
import com.pedro.encoder.video.VideoEncoder;
import com.pedro.rtplibrary.view.OpenGlView;

import java.nio.ByteBuffer;
import java.util.HashMap;

/**
 * Encode video and audio
 */
public abstract class RtspExtendBase implements GetMicrophoneData, AudioDecoderInterface, GetAacData, GetCameraData, GetVideoData {

  protected Context context;
  protected Camera1ApiManager cameraManager; //TODO can we remove camera manager? The source video seems supposed to be from render OpenGlView
  protected MicrophoneManager microphoneManager;
  protected VideoEncoder videoEncoder;
  protected AudioEncoder audioEncoder;
  protected OpenGlView openGlView;
  protected boolean streaming;
  //TODO can we remove all this? you are not using recording feature
  protected MediaMuxer mediaMuxer;
  protected int videoTrack = -1;
  protected int audioTrack = -1;
  protected boolean recording = false;
  protected boolean canRecord = false;
  protected boolean onPreview = false;
  protected MediaFormat videoFormat;
  protected MediaFormat audioFormat;

  private HashMap streamingParamMap;

  /**
   * Constructor
   * @param  openGlView     OpenGlView of pedro library
   */
  public RtspExtendBase(OpenGlView openGlView) {
    this.context = openGlView.getContext();
    this.openGlView = openGlView;
    this.videoEncoder = new VideoEncoder(this);
    this.microphoneManager = new MicrophoneManager(this);
    this.audioEncoder = new AudioEncoder(this);
    this.streaming = false;
  }

  protected abstract void startStreamRtp(String var1);

  protected abstract void stopStreamRtp();

  public boolean isStreaming() {
    return this.streaming;
  }

  protected abstract void getAacDataRtp(ByteBuffer var1, MediaCodec.BufferInfo var2);

  /**
   * Acquire encoded audio data
   *
   * @param aacBuffer out/Audio data
   * @param info out/Information of audio data, e.g. sampling rate, etc.
   */
  public void getAacData(ByteBuffer aacBuffer, MediaCodec.BufferInfo info) {
    if (Build.VERSION.SDK_INT >= 18 && this.recording && this.audioTrack != -1 && this.canRecord) {
      this.mediaMuxer.writeSampleData(this.audioTrack, aacBuffer, info);
    }
    this.getAacDataRtp(aacBuffer, info);
  }

  protected abstract void getH264DataRtp(ByteBuffer var1, MediaCodec.BufferInfo var2);

  /**
   * Acquire video data
   * Actually, still images in H.264 format
   *
   * @param h264Buffer out/video data
   * @param info out/Information of video data, e.g. frame rate, etc.
   */
  public void getH264Data(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
    if (Build.VERSION.SDK_INT >= 18 && this.recording && this.videoTrack != -1) {
      if (info.flags == 1) {
        this.canRecord = true;
      }

      if (this.canRecord) {
        this.mediaMuxer.writeSampleData(this.videoTrack, h264Buffer, info);
      }
    }

    this.getH264DataRtp(h264Buffer, info);
  }

  protected abstract void onSPSandPPSRtp(ByteBuffer var1, ByteBuffer var2);

  //TODO check Frame class constructors to know how to create a YUV Frame but I think that you have video source rendering OpenGlView so you can remove this method
  /**
   * Setting of pixel data before encoding
   * YUV format(Color difference)
   *
   * @param frame Pixel data
   */
  public void inputYUVData(Frame frame) {
    this.videoEncoder.inputYUVData(frame);
  }

  //TODO check Frame class constructors to know how to create a PCM Frame
  /**
   * Setting of audio data before encoding
   *
   */
  @Override
  public void inputPCMData(Frame frame) {
    audioEncoder.inputPCMData(frame);
  }

  /**
   * SPS and PPS configuration callback
   *
   * @param sps (Sequence Parameter Set)
   * @param pps (Picture Parameter Set)
   */
  public void onSPSandPPS(ByteBuffer sps, ByteBuffer pps) {
    this.onSPSandPPSRtp(sps, pps);
  }

  /**
   * Movie format setting callback
   *
   * @param mediaFormat Media format
   */
  public void onVideoFormat(MediaFormat mediaFormat) {
    this.videoFormat = mediaFormat;
  }

  /**
   * Audio format setup callback
   *
   * @param mediaFormat Media format
   */
  public void onAudioFormat(MediaFormat mediaFormat) {
    this.audioFormat = mediaFormat;
  }

  /**
   * Encode and start sending to server
   *
   * @param url Server URL
   */
  public void startStream(String url) {
    if (openGlView != null && Build.VERSION.SDK_INT >= 18) {
      openGlView.init();
      openGlView.setEncoderSize(videoEncoder.getWidth(), videoEncoder.getHeight());
      openGlView.start();
      openGlView.addMediaCodecSurface(videoEncoder.getInputSurface());
    }

    startStreamRtp(url);

    videoEncoder.start();
    audioEncoder.start();
    microphoneManager.start();
    streaming = true;
  }

  /**
   * Stop encoding and server transmission
   */
  public void stopStream() {
    microphoneManager.stop();
    this.stopStreamRtp();
    this.videoEncoder.stop();
    this.audioEncoder.stop();
    if (this.openGlView != null && Build.VERSION.SDK_INT >= 18) {
      this.openGlView.removeMediaCodecSurface();
      this.openGlView.stop();
    }
    this.streaming = false;
  }

  /**
   * Set parameter list of streaming
   *
   * @param value Information such as vertical and horizontal sizes
   */
  public void setStreamingParamMap(HashMap value) {
    this.streamingParamMap = value;
  }

  /**
   * Get parameter list of streaming
   *
   * @return Information such as vertical and horizontal sizes
   */
  private HashMap getStreamingParamMap() {
    return streamingParamMap;
  }

  /**
   * Movie preparation
   *
   * @return Success: true, failed: false
   */
  public boolean prepareVideo() {
    if (this.openGlView == null) {
      //TODO Why this? as I can see OpenGlView is supposed to be renderer externally. The else case should be used always
      this.cameraManager.start(3840,1920,20); //prepareCamera();
      return this.videoEncoder.prepareVideoEncoder();
    } else {
      return this.videoEncoder.prepareVideoEncoder(
//                Integer.parseInt(getStreamingParamMap().get("width").toString()),
          //              Integer.parseInt(getStreamingParamMap().get("height").toString()),
          Integer.parseInt("3840"),
          Integer.parseInt("2160"),
          Integer.parseInt(getStreamingParamMap().get("fps").toString()),
          Integer.parseInt(getStreamingParamMap().get("bitrate").toString()),
          0, 2,FormatVideoEncoder.SURFACE);
    }
  }

  protected abstract void prepareAudioRtp(boolean isStereo, int sampleRate);

  /**
   * Audio preparation
   *
   * @return Success: true, failed: false
   */
  public boolean prepareAudio() {
    microphoneManager.createMicrophone(44100, false, false, false);
    // If the argument is omitted, 128 * 1024, 44100, true, false, false
    return this.audioEncoder.prepareAudioEncoder(128 * 1024, 44100, false, 0);
  }

  //TODO can we remove this? It is unused
  /**
   * Audio decode end callback
   *
   */
  @Override
  public void onAudioDecoderFinished() {
  }

}

biviel commented 3 years ago

Hi, thanks, I will look into this. It's not visible in logcat, but the app itself exits on this camera... No error found in logcat. I will try to see whether there is another log...

I'm having issues setting up my project with Kotlin; the 2.1.3 build.gradle has changed a lot compared to 1.8.0...

will try...

biviel commented 3 years ago

I will also try a very old version again, to see if I can identify the code changes that are causing the incompatibility... I managed that with RTMP, where that additional init() call made it work on this cam.

You can see a test live stream I recorded: https://www.flow.tours/en/product/33?zone=Europe/Berlin That's RTMP with library version 1.2.9, I think.

This cam would be very good for live streaming 360° VR if it can do H.265 and RTSP with your library. However, it is made difficult by its own modified firmware/Android, which affects how the camera has to be handled and programmed, etc.

pedroSG94 commented 3 years ago

You don't need to configure Kotlin in your project. You can download the library using Gradle as described in the README and use it from Java code. The same applies if you want to download only the rtsp module: just replace rtplibrary with rtsp in the dependency (it is the same for all modules).
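
For example, to pull only the rtsp module via Gradle (a sketch; it assumes the JitPack repository is already configured as in the README):

dependencies {
  // whole library:
  // implementation 'com.github.pedroSG94.rtmp-rtsp-stream-client-java:rtplibrary:2.1.3'
  // only the rtsp module:
  implementation 'com.github.pedroSG94.rtmp-rtsp-stream-client-java:rtsp:2.1.3'
}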

Remember to delete the old version to avoid problems. Let me know if you have any problem doing the library update. I really recommend that you at least download the current rtsp module and use it together with your old version of the encoder/rtplibrary modules.

If that is totally impossible, I will check the rtsp module of your version (I'm not sure if it is 1.2.9 or 1.4.8; let me know).

biviel commented 3 years ago

hi, @pedroSG94 , I'm looking into this. I think it's related to how the camera is handled, which causes the app to stop/exit without an error message... The firmware/Android on this device is definitely a bit strange. Will come back to you, hopefully sooner than later!

Thanks!

biviel commented 3 years ago

hi, @pedroSG94 , in the end I was able to build my project by adding implementation 'com.github.pedroSG94.rtmp-rtsp-stream-client-java:rtplibrary:2.1.3' to my dependencies; I also tried it manually and it worked with some modifications. So I'm building on 2.1.3 now, which is great! But now I'm facing issues with both RTMP and RTSP streaming... :(

In older versions RTMP was working fine, so I also tried to stream RTMP now. There are also RtmpExtend and RtmpExtendBase classes; I'm attaching them here as reference. I replaced the necessary methods and I'm using an RtmpClient in place of srsFlvMuxer, but I may be missing something there. Would you be kind enough to look at it, please?

In Vysor I can see the image fine, updating frames, etc. as expected, but streaming fails...

At the bottom I attach the logcat... it's just waiting for SPS, etc.

here are my classes:

/**
 * Copyright 2018 Ricoh Company, Ltd.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package tours.flow.hdrstreaming.Extend;

import android.content.Context;
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.media.MediaMuxer;
import android.os.Build;
import android.support.annotation.RequiresApi;

import com.pedro.encoder.Frame;
import com.pedro.encoder.audio.AudioEncoder;
import com.pedro.encoder.audio.GetAacData;
import com.pedro.encoder.input.audio.GetMicrophoneData;
import com.pedro.encoder.input.audio.MicrophoneManager;
import com.pedro.encoder.input.decoder.AudioDecoderInterface;
import com.pedro.encoder.input.video.Camera1ApiManager;
//import com.pedro.encoder.input.video.Frame;
import com.pedro.encoder.input.video.GetCameraData;
import com.pedro.encoder.video.FormatVideoEncoder;
import com.pedro.encoder.video.GetVideoData;
import com.pedro.encoder.video.VideoEncoder;
import com.pedro.rtplibrary.view.OpenGlView;
import java.nio.ByteBuffer;
import java.util.HashMap;

/**
 * Encode video and audio
 */
public abstract class RtmpExtendBase implements GetMicrophoneData, AudioDecoderInterface, GetAacData, GetCameraData, GetVideoData {

    protected Context context;
    protected Camera1ApiManager cameraManager;
    protected MicrophoneManager microphoneManager;
    protected VideoEncoder videoEncoder;
    protected AudioEncoder audioEncoder;
    protected OpenGlView openGlView;
    protected boolean streaming;
    protected MediaMuxer mediaMuxer;
    protected int videoTrack = -1;
    protected int audioTrack = -1;
    protected boolean recording = false;
    protected boolean canRecord = false;
    protected boolean onPreview = false;
    protected MediaFormat videoFormat;
    protected MediaFormat audioFormat;

    private HashMap streamingParamMap;

    @RequiresApi(
        api = 18
    )

    /**
     * Constructor
     * @param  openGlView     OpenGlView of pedro library
     */
    public RtmpExtendBase(OpenGlView openGlView) {
        this.context = openGlView.getContext();
        this.openGlView = openGlView;
        this.videoEncoder = new VideoEncoder(this);
        this.microphoneManager = new MicrophoneManager(this);
        this.audioEncoder = new AudioEncoder(this);
        this.streaming = false;
    }

    protected abstract void startStreamRtp(String var1);

    protected abstract void stopStreamRtp();

    public boolean isStreaming() {
        return this.streaming;
    }

    protected abstract void getAacDataRtp(ByteBuffer var1, MediaCodec.BufferInfo var2);

    /**
     * Acquire encoded audio data
     *
     * @param aacBuffer out/Audio data
     * @param info out/Information of audio data, e.g. sampling rate, etc.
     */
    public void getAacData(ByteBuffer aacBuffer, MediaCodec.BufferInfo info) {
        if (Build.VERSION.SDK_INT >= 18 && this.recording && this.audioTrack != -1 && this.canRecord) {
            this.mediaMuxer.writeSampleData(this.audioTrack, aacBuffer, info);
        }
        this.getAacDataRtp(aacBuffer, info);
    }

    protected abstract void getH264DataRtp(ByteBuffer var1, MediaCodec.BufferInfo var2);

    /**
     * Acquire video data
     * Actually, still images in H.264 format
     *
     * @param h264Buffer out/video data
     * @param info out/Information of video data, e.g. frame rate, etc.
     */
    public void getH264Data(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        if (Build.VERSION.SDK_INT >= 18 && this.recording && this.videoTrack != -1) {
            if (info.flags == 1) {
                this.canRecord = true;
            }

            if (this.canRecord) {
                this.mediaMuxer.writeSampleData(this.videoTrack, h264Buffer, info);
            }
        }

        this.getH264DataRtp(h264Buffer, info);
    }

    protected abstract void onSPSandPPSRtp(ByteBuffer var1, ByteBuffer var2);

    /**
     * Setting of pixel data before encoding
     * YUV format(Color difference)
     *
     * @param frame Pixel data
     */
    public void inputYUVData(Frame frame) {
        this.videoEncoder.inputYUVData(frame);
    }

    /**
     * Setting of audio data before encoding
     *
     * @param frame
     */
    @Override
    public void inputPCMData(Frame frame) {
        audioEncoder.inputPCMData(frame);
    }

    /**
     * SPS and PPS configuration callback
     *
     * @param sps (Sequence Parameter Set)
     * @param pps (Picture Parameter Set)
     */
    public void onSPSandPPS(ByteBuffer sps, ByteBuffer pps) {
        this.onSPSandPPSRtp(sps, pps);
    }

    /**
     * Movie format setting callback
     *
     * @param mediaFormat Media format
     */
    public void onVideoFormat(MediaFormat mediaFormat) {
        this.videoFormat = mediaFormat;
    }

    /**
     * Audio format setup callback
     *
     * @param mediaFormat Media format
     */
    public void onAudioFormat(MediaFormat mediaFormat) {
        this.audioFormat = mediaFormat;
    }

    /**
     * Encode and start sending to server
     *
     * @param url Server URL
     */
    public void startStream(String url) {
        if (openGlView != null && Build.VERSION.SDK_INT >= 18) {
            openGlView.init();
            openGlView.setEncoderSize(videoEncoder.getWidth(), videoEncoder.getHeight());
            openGlView.start();
            openGlView.addMediaCodecSurface(videoEncoder.getInputSurface());
        }

        startStreamRtp(url);
        videoEncoder.start();
        audioEncoder.start();
        microphoneManager.start();
        streaming = true;
    }

    /**
     * Stop encoding and server transmission
     */
    public void stopStream() {
        microphoneManager.stop();
        this.stopStreamRtp();
        this.videoEncoder.stop();
        this.audioEncoder.stop();
        if (this.openGlView != null && Build.VERSION.SDK_INT >= 18) {
            this.openGlView.stop();
            this.openGlView.removeMediaCodecSurface();
        }
        this.streaming = false;
    }

    /**
     * Set parameter list of streaming
     *
     * @param value Information such as vertical and horizontal sizes
     */
    public void setStreamingParamMap(HashMap value) {
        this.streamingParamMap = value;
    }

    /**
     * Get parameter list of streaming
     *
     * @return Information such as vertical and horizontal sizes
     */
    private HashMap getStreamingParamMap() {
        return streamingParamMap;
    }

    /**
     * Movie preparation
     *
     * @return Success: true, failed: false
     */
    public boolean prepareVideo() {
        if (this.openGlView == null) {
            this.cameraManager.start(3840,1920,20); //prepareCamera();
            return this.videoEncoder.prepareVideoEncoder();
        } else {
            return this.videoEncoder.prepareVideoEncoder(
//                Integer.parseInt(getStreamingParamMap().get("width").toString()),
  //              Integer.parseInt(getStreamingParamMap().get("height").toString()),
                Integer.parseInt("3840"),
                Integer.parseInt("2160"),
                    Integer.parseInt(getStreamingParamMap().get("fps").toString()),
                Integer.parseInt(getStreamingParamMap().get("bitrate").toString()),
                0, 20,FormatVideoEncoder.SURFACE);
        }
    }

    protected abstract void prepareAudioRtp(boolean isStereo, int sampleRate);

    /**
     * Audio preparation
     *
     * @return Success: true, failed: false
     */
    public boolean prepareAudio() {
        microphoneManager.createMicrophone(44100, false, false, false);
        // If the argument is omitted, 128 * 1024, 44100, true, false, false
        return this.audioEncoder.prepareAudioEncoder(128 * 1024, 44100, false,0);
    }

    /**
     * Audio decode end callback
     *
     */
    @Override
    public void onAudioDecoderFinished() {
    }

}

and here you can see the RtmpExtend class:

/**
 * Copyright 2018 Ricoh Company, Ltd.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 * http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package tours.flow.hdrstreaming.Extend;

import android.media.MediaCodec;
import android.os.Build;
import android.support.annotation.RequiresApi;

//import com.pedro.encoder.input.video.Frame;
import com.pedro.encoder.Frame;
import com.pedro.rtplibrary.view.OpenGlView;
import java.nio.ByteBuffer;
//import net.ossrs.rtmp.ConnectCheckerRtmp;
//import net.ossrs.rtmp.SrsFlvMuxer;
import com.pedro.rtmp.utils.ConnectCheckerRtmp;
import com.pedro.rtmp.rtmp.RtmpClient;

/**
 * Transmits video and audio data (stream) in Flv container.
 */
public class RtmpExtend extends RtmpExtendBase {

    private RtmpClient srsFlvMuxer;

    /**
     * Constructor
     *
     * @param openGlView OpenGlView of pedro library
     * @param connectChecker Callback class called by connection state
     */
    @RequiresApi(api = Build.VERSION_CODES.JELLY_BEAN_MR2)
    public RtmpExtend(OpenGlView openGlView, ConnectCheckerRtmp connectChecker) {
        super(openGlView);
        srsFlvMuxer = new RtmpClient(connectChecker);
    }

    /**
     * Start sending to server
     *
     * @param url Server URL
     */
    @Override
    protected void startStreamRtp(String url) {
        if (videoEncoder.getRotation() == 90 || videoEncoder.getRotation() == 270) {
            srsFlvMuxer.setVideoResolution(videoEncoder.getHeight(), videoEncoder.getWidth());
        } else {
            srsFlvMuxer.setVideoResolution(videoEncoder.getWidth(), videoEncoder.getHeight());
        }

        srsFlvMuxer.connect(url);
    }

    /**
     * End transmission to server
     */
    @Override
    protected void stopStreamRtp() {
        srsFlvMuxer.disconnect();
    }

    /**
     * Send audio to server
     */
    @Override
    protected void getAacDataRtp(ByteBuffer aacBuffer, MediaCodec.BufferInfo info) {
        srsFlvMuxer.sendAudio(aacBuffer, info);
    }

    /**
     * Send video (H.264) to server
     */
    @Override
    protected void getH264DataRtp(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        srsFlvMuxer.sendVideo(h264Buffer, info);
    }

    /**
     * Setting of SPS / PPS (data used in containers)
     *
     * @param sps (Sequence Parameter Set)
     * @param pps (Picture Parameter Set)
     */
    @Override
    protected void onSPSandPPSRtp(ByteBuffer sps, ByteBuffer pps) {
        //srsFlvMuxer.setSpsPPs(sps, pps);
    }

    @Override
    protected void prepareAudioRtp(boolean isStereo, int sampleRate) {
        //srsFlvMuxer.setIsStereo(isStereo);
        //srsFlvMuxer.setSampleRate(sampleRate);
    }

    @Override
    public void inputYUVData(Frame frame) {

    }

    @Override
    public void onSpsPpsVps(ByteBuffer sps, ByteBuffer pps, ByteBuffer vps) {
        //srsFlvMuxer.setSpsPPs(sps, pps);

    }

    @Override
    public void getVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        srsFlvMuxer.sendVideo(h264Buffer, info);
    }

    @Override
    public void inputPCMData(Frame frame) {

    }
}

and here you can see the logcat itself; this is for RTMP now, not RTSP, which also fails.

I/VideoEncoder: prepared
I/OpenGlViewBase: Thread started.
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
I/SurfaceManager: GL released
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
I/SurfaceManager: GL initialized
I/VideoEncoder: started
I/MediaCodec: MediaCodec will operate in async mode
I/AudioEncoder: started
I/MediaCodec: MediaCodec will operate in async mode
I/MicrophoneManager: Microphone started
I/Handshake: writing C0
    C0 write successful
    writing C1
    writing time 1636288385 to c1
    writing zero to c1
    writing random to c1
I/Handshake: C1 write successful
I/Handshake: reading S0
I/Handshake: read S0 successful
    reading S1
    read S1 successful
    writing C2
    C2 write successful
    reading S2
    read S2 successful
I/CommandsManager: using default write chunk size 128
I/CommandsManager: send Command(name='connect', transactionId=1, timeStamp=0, streamId=0, data=[AmfString value: connect, AmfNumber value: 1.0, AmfObject properties: {AmfString value: app=AmfString value: livestream, AmfString value: flashVer=AmfString value: FMLE/3.0 (compatible; Lavf57.56.101), AmfString value: swfUrl=AmfString value: , AmfString value: tcUrl=AmfString value: rtmp://feeder.flow.tours:1935/livestream, AmfString value: fpad=AmfBoolean value: false, AmfString value: capabilities=AmfNumber value: 239.0, AmfString value: pageUrl=AmfString value: , AmfString value: objectEncoding=AmfNumber value: 0.0}], bodySize=219)
I/CommandsManager: read WindowAcknowledgementSize(acknowledgementWindowSize=2500000)
I/CommandsManager: read SetPeerBandwidth(acknowledgementWindowSize=2500000, type=DYNAMIC)
I/CommandsManager: read UserControl(type=STREAM_BEGIN, event=Event(data=0, bufferLength=-1), bodySize=6)
I/RtmpClient: user control command STREAM_BEGIN ignored
I/CommandsManager: read SetChunkSize(chunkSize=4096)
I/RtmpClient: chunk size configured to 4096
I/CommandsManager: read Command(name='_result', transactionId=1, timeStamp=0, streamId=0, data=[AmfString value: _result, AmfNumber value: 1.0, AmfObject properties: {AmfString value: fmsVer=AmfString value: FMS/3,5,7,7009, AmfString value: capabilities=AmfNumber value: 31.0, AmfString value: mode=AmfNumber value: 1.0}, AmfObject properties: {AmfString value: level=AmfString value: status, AmfString value: code=AmfString value: NetConnection.Connect.Success, AmfString value: description=AmfString value: Connection succeeded., AmfString value: data=AmfEcmaArray length: 0, properties: {AmfString value: version=AmfString value: 3,5,7,7009}, AmfString value: clientid=AmfNumber value: 9.64704158E8, AmfString value: objectEncoding=AmfNumber value: 0.0}], bodySize=261)
I/CommandsManager: send Command(name='releaseStream', transactionId=2, timeStamp=0, streamId=0, data=[AmfString value: releaseStream, AmfNumber value: 2.0, AmfNull, AmfString value: myStream?biviel@gmail.com&heaven1], bodySize=62)
    send Command(name='FCPublish', transactionId=3, timeStamp=0, streamId=0, data=[AmfString value: FCPublish, AmfNumber value: 3.0, AmfNull, AmfString value: myStream?biviel@gmail.com&heaven1], bodySize=58)
I/CommandsManager: send Command(name='createStream', transactionId=4, timeStamp=0, streamId=0, data=[AmfString value: createStream, AmfNumber value: 4.0, AmfNull], bodySize=25)
I/RtmpClient: success response received from connect
I/CommandsManager: read Command(name='onFCPublish', transactionId=0, timeStamp=0, streamId=0, data=[AmfString value: onFCPublish, AmfNumber value: 0.0, AmfNull, AmfObject properties: {AmfString value: level=AmfString value: status, AmfString value: code=AmfString value: NetStream.Publish.Start, AmfString value: description=AmfString value: FCPublish to stream myStream., AmfString value: clientid=AmfNumber value: 9.64704158E8}], bodySize=140)
I/RtmpClient: unknown onFCPublish response received from unknown command
I/CommandsManager: read Command(name='_result', transactionId=4, timeStamp=0, streamId=0, data=[AmfString value: _result, AmfNumber value: 4.0, AmfNull, AmfNumber value: 1.0], bodySize=29)
I/CommandsManager: send Command(name='publish', transactionId=5, timeStamp=0, streamId=1, data=[AmfString value: publish, AmfNumber value: 5.0, AmfNull, AmfString value: myStream?biviel@gmail.com&heaven1, AmfString value: live], bodySize=63)
I/RtmpClient: success response received from createStream
D/ACodec: dataspace changed to 0x10c60000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:6(BT2020), T:3(SMPTE_170M))
I/CommandsManager: read UserControl(type=STREAM_BEGIN, event=Event(data=1, bufferLength=-1), bodySize=6)
I/RtmpClient: user control command STREAM_BEGIN ignored
I/CommandsManager: read Command(name='onStatus', transactionId=0, timeStamp=0, streamId=0, data=[AmfString value: onStatus, AmfNumber value: 0.0, AmfNull, AmfObject properties: {AmfString value: level=AmfString value: status, AmfString value: code=AmfString value: NetStream.Publish.Start, AmfString value: description=AmfString value: Publishing myStream., AmfString value: clientid=AmfNumber value: 9.64704158E8}], bodySize=128)
I/CommandsManager: send Data(name='@setDataFrame', data=[AmfString value: onMetaData, AmfEcmaArray length: 3, properties: {AmfString value: duration=AmfNumber value: 0.0, AmfString value: stereo=AmfBoolean value: true, AmfString value: filesize=AmfNumber value: 0.0}], bodySize=85)
E/H264Packet: waiting for a valid sps and pps
I/MainActivity: at tours.flow.hdrstreaming(MainActivity.java:313) . [run] --- Rtmp サーバーへ 接続成功しました。
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
    waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
E/H264Packet: waiting for a valid sps and pps
I/RtmpSender: Skipping iteration, frame null
E/H264Packet: waiting for a valid sps and pps

Thanks!

biviel commented 3 years ago

hi, @pedroSG94 , I adjusted the code above with these methods:

    protected void onSPSandPPSRtp(ByteBuffer sps, ByteBuffer pps) {
        ByteBuffer newSps = sps.duplicate();
        ByteBuffer newPps = pps.duplicate();
        ByteBuffer newVps = null;
        //rtspclient.setSPSandPPS(newSps, newPps, newVps);
        //rtspclient.connect();

        srsFlvMuxer.setVideoInfo(sps, pps, null);
    }

    @Override
    protected void prepareAudioRtp(boolean isStereo, int sampleRate) {
        //srsFlvMuxer.setIsStereo(isStereo);
        //srsFlvMuxer.setSampleRate(sampleRate);
        srsFlvMuxer.setAudioInfo(sampleRate, isStereo);
    }

and video works fine, but there is no audio at all over RTMP. Do you have any suggestions?

this is from the logcat now:

I/MicrophoneManager: Microphone created, 44100hz, Mono
I/AudioEncoder: 1 encoders found
    Encoder OMX.google.aac.encoder
    Encoder selected OMX.google.aac.encoder
I/OMXClient: MuxOMX ctor
I/AudioEncoder: prepared
I/VideoEncoder: 2 encoders found
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
    Unrecognized profile 2130706434 for video/avc
I/VideoEncoder: Encoder OMX.qcom.video.encoder.avc
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
    Unrecognized profile 2130706434 for video/avc
I/VideoEncoder: Color supported: 2141391876
    Color supported: 2130708361
I/VideoEncoder: Encoder selected OMX.qcom.video.encoder.avc
I/OMXClient: MuxOMX ctor
I/VideoEncoder: Prepare video info: SURFACE, 3840x1920
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
    Unrecognized profile 2130706434 for video/avc
I/VideoEncoder: bitrate mode CBR not supported using default mode
E/ACodec: [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -1010
W/ACodec: do not know color format 0x7fa30c04 = 2141391876
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/ACodec: setupAVCEncoderParameters with [profile: Baseline] [level: Level1]
I/ACodec: [OMX.qcom.video.encoder.avc] cannot encode HDR static metadata. Ignoring.
    setupVideoEncoder succeeded
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/VideoEncoder: prepared
I/OpenGlViewBase: Thread started.
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
I/SurfaceManager: GL released
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
I/SurfaceManager: GL initialized
I/VideoEncoder: started
I/MediaCodec: MediaCodec will operate in async mode
I/AudioEncoder: started
I/MediaCodec: MediaCodec will operate in async mode
I/MicrophoneManager: Microphone started
I/Handshake: writing C0
    C0 write successful
    writing C1
    writing time 1636302820 to c1
    writing zero to c1
    writing random to c1
I/Handshake: C1 write successful
    reading S0
I/Handshake: read S0 successful
    reading S1
    read S1 successful
    writing C2
    C2 write successful
    reading S2
    read S2 successful
I/CommandsManager: using default write chunk size 128
I/CommandsManager: send Command(name='connect', transactionId=1, timeStamp=0, streamId=0, data=[AmfString value: connect, AmfNumber value: 1.0, AmfObject properties: {AmfString value: app=AmfString value: livestream, AmfString value: flashVer=AmfString value: FMLE/3.0 (compatible; Lavf57.56.101), AmfString value: swfUrl=AmfString value: , AmfString value: tcUrl=AmfString value: rtmp://feeder.flow.tours:1935/livestream, AmfString value: fpad=AmfBoolean value: false, AmfString value: capabilities=AmfNumber value: 239.0, AmfString value: pageUrl=AmfString value: , AmfString value: objectEncoding=AmfNumber value: 0.0}], bodySize=219)
I/CommandsManager: read WindowAcknowledgementSize(acknowledgementWindowSize=2500000)
    read SetPeerBandwidth(acknowledgementWindowSize=2500000, type=DYNAMIC)
I/CommandsManager: read UserControl(type=STREAM_BEGIN, event=Event(data=0, bufferLength=-1), bodySize=6)
I/RtmpClient: user control command STREAM_BEGIN ignored
I/CommandsManager: read SetChunkSize(chunkSize=4096)
I/RtmpClient: chunk size configured to 4096
I/CommandsManager: read Command(name='_result', transactionId=1, timeStamp=0, streamId=0, data=[AmfString value: _result, AmfNumber value: 1.0, AmfObject properties: {AmfString value: fmsVer=AmfString value: FMS/3,5,7,7009, AmfString value: capabilities=AmfNumber value: 31.0, AmfString value: mode=AmfNumber value: 1.0}, AmfObject properties: {AmfString value: level=AmfString value: status, AmfString value: code=AmfString value: NetConnection.Connect.Success, AmfString value: description=AmfString value: Connection succeeded., AmfString value: data=AmfEcmaArray length: 0, properties: {AmfString value: version=AmfString value: 3,5,7,7009}, AmfString value: clientid=AmfNumber value: 1.149921226E9, AmfString value: objectEncoding=AmfNumber value: 0.0}], bodySize=261)
I/CommandsManager: send Command(name='releaseStream', transactionId=2, timeStamp=0, streamId=0, data=[AmfString value: releaseStream, AmfNumber value: 2.0, AmfNull, AmfString value: myStream?biviel@gmail.com&heaven1], bodySize=62)
    send Command(name='FCPublish', transactionId=3, timeStamp=0, streamId=0, data=[AmfString value: FCPublish, AmfNumber value: 3.0, AmfNull, AmfString value: myStream?biviel@gmail.com&heaven1], bodySize=58)
I/CommandsManager: send Command(name='createStream', transactionId=4, timeStamp=0, streamId=0, data=[AmfString value: createStream, AmfNumber value: 4.0, AmfNull], bodySize=25)
I/RtmpClient: success response received from connect
I/CommandsManager: read Command(name='onFCPublish', transactionId=0, timeStamp=0, streamId=0, data=[AmfString value: onFCPublish, AmfNumber value: 0.0, AmfNull, AmfObject properties: {AmfString value: level=AmfString value: status, AmfString value: code=AmfString value: NetStream.Publish.Start, AmfString value: description=AmfString value: FCPublish to stream myStream., AmfString value: clientid=AmfNumber value: 1.149921226E9}], bodySize=140)
I/RtmpClient: unknown onFCPublish response received from unknown command
I/CommandsManager: read Command(name='_result', transactionId=4, timeStamp=0, streamId=0, data=[AmfString value: _result, AmfNumber value: 4.0, AmfNull, AmfNumber value: 1.0], bodySize=29)
    send Command(name='publish', transactionId=5, timeStamp=0, streamId=1, data=[AmfString value: publish, AmfNumber value: 5.0, AmfNull, AmfString value: myStream?biviel@gmail.com&heaven1, AmfString value: live], bodySize=63)
I/RtmpClient: success response received from createStream
I/CommandsManager: read UserControl(type=STREAM_BEGIN, event=Event(data=1, bufferLength=-1), bodySize=6)
I/RtmpClient: user control command STREAM_BEGIN ignored
D/ACodec: dataspace changed to 0x10c60000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:6(BT2020), T:3(SMPTE_170M))
I/RtmpClient: send sps and pps
I/CommandsManager: read Command(name='onStatus', transactionId=0, timeStamp=0, streamId=0, data=[AmfString value: onStatus, AmfNumber value: 0.0, AmfNull, AmfObject properties: {AmfString value: level=AmfString value: status, AmfString value: code=AmfString value: NetStream.Publish.Start, AmfString value: description=AmfString value: Publishing myStream., AmfString value: clientid=AmfNumber value: 1.149921226E9}], bodySize=128)
I/CommandsManager: send Data(name='@setDataFrame', data=[AmfString value: onMetaData, AmfEcmaArray length: 3, properties: {AmfString value: duration=AmfNumber value: 0.0, AmfString value: stereo=AmfBoolean value: true, AmfString value: filesize=AmfNumber value: 0.0}], bodySize=85)
I/MainActivity: at tours.flow.hdrstreaming(MainActivity.java:313) . [run] --- Rtmp サーバーへ 接続成功しました。
I/RtmpSender: wrote Video packet, size 50
I/RtmpSender: wrote Video packet, size 46177
I/RtmpSender: wrote Video packet, size 11435
I/RtmpSender: wrote Video packet, size 16042
I/RtmpSender: wrote Video packet, size 22862
I/RtmpSender: wrote Video packet, size 322531
I/RtmpSender: wrote Video packet, size 138797
I/RtmpSender: wrote Video packet, size 67355
I/CommandsManager: read Acknowledgement(sequenceNumber=629473)
I/RtmpSender: wrote Video packet, size 333065
I/RtmpSender: wrote Video packet, size 85180
I/RtmpSender: wrote Video packet, size 36259
I/RtmpSender: wrote Video packet, size 56044
I/RtmpSender: wrote Video packet, size 80522
I/RtmpSender: wrote Video packet, size 70285
I/CommandsManager: read Acknowledgement(sequenceNumber=1252528)
I/RtmpSender: wrote Video packet, size 127328
I/RtmpSender: wrote Video packet, size 121292
I/RtmpSender: wrote Video packet, size 70621
I/RtmpSender: wrote Video packet, size 125442
I/RtmpSender: wrote Video packet, size 130862
I/CommandsManager: read Acknowledgement(sequenceNumber=1877427)
I/RtmpSender: wrote Video packet, size 123269
I/RtmpSender: wrote Video packet, size 99224
I/RtmpSender: wrote Video packet, size 122075
I/RtmpSender: wrote Video packet, size 120491
I/RtmpSender: wrote Video packet, size 171674
I/CommandsManager: read Acknowledgement(sequenceNumber=2503213)

Thanks a lot!

pedroSG94 commented 3 years ago

First of all, check that you are sending audio to the RtmpClient properly.

biviel commented 3 years ago

hi, @pedroSG94, I had to revert back to version 1.8.0 and it worked! I'm now able to run both RTMP and RTSP in 4K resolution with h264 encoding.

I will do my best to move forward with the update, but for now 1.8.0 works quite well; on the newer versions the live stream gets stuck a lot for some reason and RTSP still hangs. I would also like to try h265 encoded streaming, but it fails.

I will keep trying to update. But when I set h265 (HEVC) encoding and start the stream, it stops instantly:

        startStreamRtp(url);
        videoEncoder.setType(CodecUtil.H265_MIME);
        videoEncoder.start(true);
        audioEncoder.start(true);
        microphoneManager.start();

this is the logcat:

I/MicrophoneManager: Microphone created, 44100hz, Mono
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
    Unrecognized profile 2130706434 for video/avc
W/Utils: could not parse long range '175-174'
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
    Unrecognized profile 2130706434 for video/avc
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
    Unrecognized profile 2130706434 for video/avc
W/VideoCapabilities: Unrecognized profile/level 0/3 for video/mpeg2
W/VideoCapabilities: Unrecognized profile/level 0/3 for video/mpeg2
I/VideoCapabilities: Unsupported profile 4 for video/mp4v-es
I/OMXClient: MuxOMX ctor
I/AudioEncoder: prepared
I/VideoEncoder: VideoEncoder OMX.qcom.video.encoder.avc
W/VideoCapabilities: Unrecognized profile 2130706433 for video/avc
    Unrecognized profile 2130706434 for video/avc
I/VideoEncoder: Color supported: 2141391876
I/VideoEncoder: Color supported: 2130708361
I/OMXClient: MuxOMX ctor
I/VideoEncoder: Prepare video info: SURFACE, 3840x1920
E/ACodec: [OMX.qcom.video.encoder.avc] storeMetaDataInBuffers (output) failed w/ err -1010
W/ACodec: do not know color format 0x7fa30c04 = 2141391876
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/ACodec: setupAVCEncoderParameters with [profile: Baseline] [level: Level1]
I/ACodec: [OMX.qcom.video.encoder.avc] cannot encode HDR static metadata. Ignoring.
    setupVideoEncoder succeeded
W/ACodec: do not know color format 0x7f000789 = 2130708361
I/VideoEncoder: prepared
I/OpenGlViewBase: Thread started.
I/MediaCodec: MediaCodec will operate in async mode
I/VideoEncoder: started
I/AudioEncoder: started
I/MicrophoneManager: Microphone started
D/ACodec: dataspace changed to 0x10c60000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:6(BT2020), T:3(SMPTE_170M))
I/MicrophoneManager: Microphone stopped
I/VideoEncoder: stopped
I/AudioEncoder: stopped

I can see in the log that I/VideoEncoder: VideoEncoder OMX.qcom.video.encoder.avc is used, so the H264 encoder is still selected.

This device has a Snapdragon chip that should be able to do h265 hardware encoding. Is there a way to enable it, or is some change required?

Thanks!

pedroSG94 commented 3 years ago

You need to call setType before calling prepareVideo: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtplibrary/src/main/java/com/pedro/rtplibrary/rtsp/RtspCamera1.java#L135

Also, remember to change the onSPSandPPSRtp method to provide the vps to the RTSP lib (if it is null the stream is treated as h264, if not null as h265).
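
A minimal ordering sketch of that advice, assuming a custom base class with the fields used elsewhere in this thread (videoEncoder, rtspclient) and an onSPSandPPSRtp-style callback; prepareVideo()/prepareAudio() stand for your existing prepare methods, and the RtspClient setter name differs between library versions, so treat it as a placeholder rather than the actual API:

    // Sketch only: select the codec BEFORE the encoder is prepared, then start as usual.
    public void switchToH265AndStream(String url) {
        videoEncoder.setType(CodecUtil.H265_MIME);
        prepareVideo();       // the existing prepare call now configures an HEVC encoder
        prepareAudio();
        startStream(url);     // the existing start logic (encoders, microphone, RTSP connect)
    }

    // Forward the vps for H265 (leave it null for H264) so the RTSP client can build the SDP.
    protected void onSPSandPPSRtp(ByteBuffer sps, ByteBuffer pps, ByteBuffer vps) {
        rtspclient.setSPSandPPS(sps, pps, vps); // placeholder setter name; check your library version
    }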

biviel commented 3 years ago

hi, @pedroSG94, it took some time, but I'm now running the latest version of the library, 2.1.3. It will take me a little while to collect the related issues; I will come back to you ASAP! Thanks!

biviel commented 3 years ago

hi, @pedroSG94, I will try to keep this post short and give you an update on where I am now; I'm still happy to be running the latest version on this device. Both RTMP and RTSP work more or less, with some issues.

  1. FPS - I noticed that for some reason the FPS is not good enough; this is the same for both RTMP and RTSP, as if there is not enough GPU/CPU power to render, encode, packetize, etc. I was looking through the issues list here for a potential solution and in theory using LightOpenGlView instead of OpenGlView may help. I tried to simply replace it in my rtmpextend and rtspextend classes but got error messages about the surface. I couldn't make it work yet, but I'm still trying.

  2. RTSP audio issues - when I stream over RTSP the audio is delayed compared to the video and the voice sounds wrong, as if the samplerate were too low... It's like the speech is slowed down because the samplerate has decreased for some reason. I see no error messages at all. The video also stops after some time, as if it waits until the audio catches up to the same position; playback then resumes, but the voice is still bad (slowed down, decreased samplerate), the video stops again after a while, and this repeats.

  3. When I set videoEncoder.setType(CodecUtil.H265_MIME); for RTSP, I would expect streaming in HEVC to require less bandwidth for the same quality... But according to my tests it doesn't work like that: it requires the same network upload bandwidth and the quality doesn't seem any better. Is there any other configuration setting I should apply here?

  4. Audio - if set to stereo I'm facing issues: audio/video sync problems and a strange voice, as if the samplerate doesn't match between the audio encoder and the microphone.

        microphoneManager.createMicrophone(44100, false, false, false);
        return this.audioEncoder.prepareAudioEncoder(128 * 1024, 44100, false,0);
    }

    Regarding the stereo audio issue, I should mention that this camera supports spatial audio, but I was turning spatial audio off with a command via the camera API; stereo was working fine with the earlier version of the library. Also, I'm using the branch you modified recently for the rattling audio issue...

This is the only configuration that works; for some reason I can't make stereo work.

Your notes are very welcome, thanks in advance!

pedroSG94 commented 3 years ago

Hello,

1 - I recommend you try to detect the bottleneck, which should be one of these: the camera producing frames too slowly, the VideoEncoder, or the RtmpClient packetization and send. To do that, first comment out the RtmpClient.sendVideo call and check the FPS produced in the VideoEncoder callback to see whether it is close to the desired value; after that, remove the VideoEncoder set-frame call and check the FPS provided by the camera (or check whether the preview is smooth if you are using OpenGlView). If both produce a good FPS rate, the problem could be the RtmpClient. In that case check that you never see a frame drop in logcat, because that is normally a bandwidth problem.
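
A rough FPS probe along those lines (a sketch, not library API; it simply counts encoder output callbacks per second while the send call is commented out, and the field names are made up for illustration):

    private int fpsProbeCount = 0;
    private long fpsProbeWindowStart = 0;

    public void getVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        long now = System.currentTimeMillis();
        if (fpsProbeWindowStart == 0) fpsProbeWindowStart = now;
        fpsProbeCount++;
        if (now - fpsProbeWindowStart >= 1000) {
            Log.i("FpsProbe", "encoder output fps: " + fpsProbeCount);
            fpsProbeCount = 0;
            fpsProbeWindowStart = now;
        }
        // rtmpClient.sendVideo(h264Buffer, info); // temporarily disabled while measuring encoder FPS
    }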

2 - In this case try not to use the branch that changes the timestamps (the branch for the rattling audio issue), because this could be the reason. Also, check the presentationTimeUs provided by MediaCodec.BufferInfo to see whether the delay problem is in the encoders.

3 - I tested this feature on 3 devices and on all of them I could reduce the bitrate by around 30-40% without losing quality. This depends on the codec implementation in your device, so there is nothing I can do about this problem. My only recommendation is to check (in logcat) that you are not dropping frames, check whether you have another H265 codec and try that, or test with different parameters.

4 - If you pass the same parameters to the RtmpClient, everything should work fine. If you are using my MicrophoneManager, try this:

        microphoneManager.createMicrophone(44100, true, false, false);
        return this.audioEncoder.prepareAudioEncoder(128 * 1024, 44100, true, microphoneManager.getMaxInputSize());
    }

This last code could be really important to fix all your problems with audio. Maybe you should start with that.

biviel commented 3 years ago

Thanks for your fast reply!

2 - this issue with RTSP is there all the time. It's as if the frequency is not right for some reason.

1 - it may be that this camera does in-camera stitching of the dual-fisheye video to equirectangular using its GPU, so it requires a lot of power. But I thought that using LightOpenGlView might help a bit...

I will try to proceed accordingly and let you know. I feel that issues 4 and 2 may be related...

biviel commented 3 years ago

@pedroSG94 , microphoneManager.getMaxInputSize() helped a lot, thanks!

I'm now focusing on issue 2, the RTSP audio issue. I modified my rtspextend class to print the timestamps:

    protected void getAacDataRtp(ByteBuffer aacBuffer, MediaCodec.BufferInfo info) {
        //srsFlvMuxer.sendAudio(aacBuffer, info);
        Log.e("Audio timestamp", "" + info.presentationTimeUs);
        //Audio
        rtspclient.sendAudio(aacBuffer, info);
    }

    public void getVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        //*srsFlvMuxer.sendVideo(h264Buffer, info);
        //Video
        Log.e("Video timestamp", "" + info.presentationTimeUs);
        rtspclient.sendVideo(h264Buffer, info);
    }

and in logcat I can see a strange offset between the audio and video timestamps:


I/OpenGlViewBase: Thread started.
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
I/SurfaceManager: GL released
E/SurfaceManager: GL already released
I/SurfaceManager: GL initialized
I/SurfaceManager: GL initialized
I/d$a: at tours.flow.hdrstreaming.a(:210) . [a] --- Record log - record state: STARTED
I/VideoEncoder: started
I/MediaCodec: MediaCodec will operate in async mode
I/RtspClient: waiting for sps and pps
I/AudioEncoder: started
I/MediaCodec: MediaCodec will operate in async mode
I/MicrophoneManager: Microphone started
E/Audio timestamp: 0
E/Audio timestamp: 137377
E/Audio timestamp: 160596
E/Audio timestamp: 183225
E/Audio timestamp: 206444
E/Audio timestamp: 229403
E/Audio timestamp: 252622
E/Audio timestamp: 275648
E/Audio timestamp: 298611
E/Audio timestamp: 321830
    344786
E/Audio timestamp: 368005
E/Audio timestamp: 391009
E/Audio timestamp: 414228
E/Audio timestamp: 437174
E/Audio timestamp: 460137
E/Audio timestamp: 483356
E/Audio timestamp: 506702
E/Audio timestamp: 529921
    552911
E/Audio timestamp: 576130
E/Audio timestamp: 598813
E/Audio timestamp: 621651
E/Audio timestamp: 644870
E/Audio timestamp: 667814
E/Audio timestamp: 691033
    714152
E/Audio timestamp: 737371
E/Audio timestamp: 760251
E/Audio timestamp: 783195
E/Audio timestamp: 806414
E/Audio timestamp: 829373
E/Audio timestamp: 852592
E/Audio timestamp: 875659
    898565
E/Audio timestamp: 921784
    944810
E/Audio timestamp: 968029
    991008
D/ACodec: dataspace changed to 0x10c60000 (R:2(Limited), P:3(BT601_6_625), M:3(BT601_6), T:3(SMPTE170M)) (R:2(Limited), S:6(BT2020), T:3(SMPTE_170M))
E/Audio timestamp: 1014227
I/RtspClient: send sps and pps
I/RtspClient: send sps and pps
E/Video timestamp: 1168802
E/Audio timestamp: 1037240
E/Audio timestamp: 1060793
E/Video timestamp: 1179931
I/CommandsManager: OPTIONS rtsp://feeder.flow.tours:1935/livestream/myStream RTSP/1.0
    CSeq: 1
    User-Agent: com.pedro.rtsp 2.1.3
E/Audio timestamp: 1084012
    1106547
I/CommandsManager: RTSP/1.0 200 OK
    CSeq: 1
    Server: Wowza Streaming Engine 4.8.13+1 build20210527172944
    Cache-Control: no-cache
    Public: DESCRIBE, SETUP, TEARDOWN, PLAY, PAUSE, OPTIONS, ANNOUNCE, RECORD, GET_PARAMETER
    Supported: play.basic, con.persistent
E/Video timestamp: 1255736
I/CommandsManager: ANNOUNCE rtsp://feeder.flow.tours:1935/livestream/myStream RTSP/1.0
    Content-Type: application/sdp
    CSeq: 2
    User-Agent: com.pedro.rtsp 2.1.3
    Content-Length: 514

    v=0
    o=- 0 0 IN IP4 127.0.0.1
    s=Unnamed
    i=N/A
    c=IN IP4 feeder.flow.tours
    t=0 0
    a=recvonly
    m=video 0 RTP/AVP 96
    a=rtpmap:96 H265/90000
    a=fmtp:96 packetization-mode=1; sprop-sps=QgEBAWAAAAMAsAAAAwAAAwB4oAHgIAIgfE5a7kyS7lIKDAwF2hQl; sprop-pps=RAHA8YAEIA==; sprop-vps=QAEMAf//AWAAAAMAsAAAAwAAAwB4rFk=
    a=control:streamid=0
    m=audio 0 RTP/AVP 97
    a=rtpmap:97 MPEG4-GENERIC/32000/2
    a=fmtp:97 profile-level-id=1; mode=AAC-hbr; config=1290; sizelength=13; indexlength=3; indexdeltalength=3
    a=control:streamid=1
E/Audio timestamp: 1129766
    1153264
E/Video timestamp: 1301054
I/CommandsManager: RTSP/1.0 200 OK
    CSeq: 2
    Server: Wowza Streaming Engine 4.8.13+1 build20210527172944
    Cache-Control: no-cache
    Session: 1507221400;timeout=60
I/RtspClient: announce success
E/Audio timestamp: 1176483
I/CommandsManager: SETUP rtsp://feeder.flow.tours:1935/livestream/myStream/streamid=0 RTSP/1.0
    Transport: RTP/AVP/TCP;unicast;interleaved=0-1;mode=record
    CSeq: 3
    User-Agent: com.pedro.rtsp 2.1.3
    Session: 1507221400
E/Video timestamp: 1312296
I/CommandsManager: RTSP/1.0 200 OK
    CSeq: 3
    Server: Wowza Streaming Engine 4.8.13+1 build20210527172944
    Cache-Control: no-cache
    Expires: Sun, 14 Nov 2021 14:44:21 UTC
    Transport: RTP/AVP/TCP;unicast;interleaved=0-1;mode=record
    Date: Sun, 14 Nov 2021 14:44:21 UTC
    Session: 1507221400;timeout=60
I/CommandsManager: SETUP rtsp://feeder.flow.tours:1935/livestream/myStream/streamid=1 RTSP/1.0
    Transport: RTP/AVP/TCP;unicast;interleaved=2-3;mode=record
    CSeq: 4
    User-Agent: com.pedro.rtsp 2.1.3
    Session: 1507221400
E/Audio timestamp: 1198861
E/Audio timestamp: 1233120
E/Video timestamp: 1361687
I/CommandsManager: RTSP/1.0 200 OK
    CSeq: 4
    Server: Wowza Streaming Engine 4.8.13+1 build20210527172944
    Cache-Control: no-cache
    Expires: Sun, 14 Nov 2021 14:44:21 UTC
    Transport: RTP/AVP/TCP;unicast;interleaved=2-3;mode=record
    Date: Sun, 14 Nov 2021 14:44:21 UTC
    Session: 1507221400;timeout=60
I/CommandsManager: RECORD rtsp://feeder.flow.tours:1935/livestream/myStream RTSP/1.0
    Range: npt=0.000-
    CSeq: 5
    User-Agent: com.pedro.rtsp 2.1.3
    Session: 1507221400
E/Audio timestamp: 1256339
    1277907
I/CommandsManager: RTSP/1.0 200 OK
    CSeq: 5
    Server: Wowza Streaming Engine 4.8.13+1 build20210527172944
    Cache-Control: no-cache
    Range: npt=now-
    Session: 1507221400;timeout=60
E/Audio timestamp: 1301126
E/Audio timestamp: 1324237
I/BaseRtpSocket: wrote packet: Audio, size: 379
I/BaseSenderReport: wrote report: Audio, packets: 1, octet: 375
I/BaseRtpSocket: wrote packet: Audio, size: 379
E/Video timestamp: 1432545
E/Video timestamp: 1456651
E/Audio timestamp: 1347456
I/BaseRtpSocket: wrote packet: Audio, size: 375
E/Video timestamp: 1494563
E/Audio timestamp: 1370827
I/BaseRtpSocket: wrote packet: Audio, size: 385
E/Audio timestamp: 1394307
I/BaseRtpSocket: wrote packet: Audio, size: 396
E/Audio timestamp: 1417526
E/Video timestamp: 1548842
E/Audio timestamp: 1439851
I/BaseRtpSocket: wrote packet: Audio, size: 391
    wrote packet: Audio, size: 392
E/Audio timestamp: 1463070
I/BaseRtpSocket: wrote packet: Audio, size: 394
E/Video timestamp: 1597512
E/Audio timestamp: 1486006
E/Audio timestamp: 1508849
I/BaseRtpSocket: wrote packet: Audio, size: 413
    wrote packet: Audio, size: 405
E/Video timestamp: 1636003
E/Audio timestamp: 1532068
I/BaseRtpSocket: wrote packet: Audio, size: 392
E/Audio timestamp: 1555184
I/BaseRtpSocket: wrote packet: Audio, size: 401
E/Video timestamp: 1687049
E/Audio timestamp: 1578403
E/Audio timestamp: 1601563
I/BaseRtpSocket: wrote packet: Audio, size: 394
    wrote packet: Audio, size: 505
E/Video timestamp: 1739016
E/Audio timestamp: 1624782
I/BaseRtpSocket: wrote packet: Audio, size: 392
E/Audio timestamp: 1647656
I/BaseRtpSocket: wrote packet: Audio, size: 392
E/Audio timestamp: 1670507
I/BaseRtpSocket: wrote packet: Audio, size: 382
E/Video timestamp: 1797215
E/Audio timestamp: 1693726
I/BaseRtpSocket: wrote packet: Audio, size: 385
E/Audio timestamp: 1716788
I/BaseRtpSocket: wrote packet: Audio, size: 394
E/Video timestamp: 1840101
E/Audio timestamp: 1740007
    1762599
I/BaseRtpSocket: wrote packet: Audio, size: 398
I/BaseRtpSocket: wrote packet: Audio, size: 495
E/Video timestamp: 1898040
E/Audio timestamp: 1785818
I/BaseRtpSocket: wrote packet: Audio, size: 395
E/Video timestamp: 1944796
E/Audio timestamp: 1809385
I/BaseRtpSocket: wrote packet: Audio, size: 373
E/Audio timestamp: 1831983
I/BaseRtpSocket: wrote packet: Audio, size: 386
E/Video timestamp: 1987252
E/Audio timestamp: 1855202
E/Audio timestamp: 1878143
I/BaseRtpSocket: wrote packet: Audio, size: 377
    wrote packet: Audio, size: 398
E/Audio timestamp: 1901362
I/BaseRtpSocket: wrote packet: Audio, size: 391
E/Audio timestamp: 1925469
I/BaseRtpSocket: wrote packet: Audio, size: 393
E/Video timestamp: 2050140
E/Audio timestamp: 1948688
I/BaseRtpSocket: wrote packet: Audio, size: 393
E/Video timestamp: 2096114
E/Audio timestamp: 1970699
E/Audio timestamp: 1993374
I/BaseRtpSocket: wrote packet: Audio, size: 398
I/BaseRtpSocket: wrote packet: Audio, size: 397
E/Video timestamp: 2139340
E/Audio timestamp: 2016593
I/BaseRtpSocket: wrote packet: Audio, size: 394
E/Audio timestamp: 2039898
I/BaseRtpSocket: wrote packet: Audio, size: 395
E/Audio timestamp: 2063117
I/BaseRtpSocket: wrote packet: Audio, size: 392
E/Video timestamp: 2198170
I/BaseRtpSocket: wrote packet: Video, size: 68
I/BaseSenderReport: wrote report: Video, packets: 1, octet: 64
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
E/Audio timestamp: 2085876
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1039
    wrote packet: Audio, size: 416
E/Audio timestamp: 2108895
I/BaseRtpSocket: wrote packet: Audio, size: 401
E/Video timestamp: 2244368
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
E/Audio timestamp: 2132114
    2154830
E/Video timestamp: 2297482
E/Audio timestamp: 2178049
E/Audio timestamp: 2205891
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 984
    wrote packet: Audio, size: 444
I/BaseRtpSocket: wrote packet: Audio, size: 398
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1374
    wrote packet: Audio, size: 378
    wrote packet: Audio, size: 386
E/Video timestamp: 2350207
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
I/BaseRtpSocket: wrote packet: Video, size: 1476
    wrote packet: Video, size: 1476
    wrote packet: Video, size: 1370
E/Audio timestamp: 2229110
I/BaseRtpSocket: wrote packet: Audio, size: 386
E/Video timestamp: 2389862

I'm not sure how to proceed further. I also think the video stops after some time because of the audio sync issue.

I'm also trying to record this in my RtspExtendBase class by adding a RecordController to it and modifying the startStream method:

public void startStream(String url) {
        if (openGlView != null && Build.VERSION.SDK_INT >= 18) {
            openGlView.init();
            openGlView.setFps(20);
            openGlView.setEncoderSize(3840, 2160);
            openGlView.start();
            openGlView.addMediaCodecSurface(videoEncoder.getInputSurface());

        }
        //uxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        File file = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM);
        try {
            recordController.startRecord(file.getPath() + "/temp.mp4", new RecordController.Listener() {
                @Override
                public void onStatusChange(RecordController.Status status) {
                    Timber.i("Record log - record state: " + status.name());
                }
            });
        } catch (IOException e) {
            e.printStackTrace();
        }

        startStreamRtp(url);
        videoEncoder.start();
        audioEncoder.start();
        microphoneManager.start();
        videoEncoder.requestKeyframe();
        streaming = true;
    }

I also added recordController.stopRecord(); to the stopStream() method in the same class, but the file size is 0. I think I'm missing something there to be able to record.

Any thoughts on the RTSP audio issue and/or the recording?

pedroSG94 commented 3 years ago

About the audio timestamps: I think you can try modifying the timestamps to sync with the video using something like this, but applied to the presentationTimeUs value (basically, save the video presentationTimeUs in a variable and adjust the audio presentationTimeUs depending on the difference): https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/commit/519c7d416d027aa8fcb76ec9fe5c0fbf968e0699
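
A rough sketch of that idea, in the spirit of the linked commit but applied in the callbacks shown earlier in this thread; the field names are made up for illustration, and mutating BufferInfo in place is just the simplest way to show the offset:

    private long firstVideoTsUs = -1;
    private long audioOffsetUs = 0;
    private boolean audioOffsetSet = false;

    public void getVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        if (firstVideoTsUs < 0) firstVideoTsUs = info.presentationTimeUs;
        rtspclient.sendVideo(h264Buffer, info);
    }

    protected void getAacDataRtp(ByteBuffer aacBuffer, MediaCodec.BufferInfo info) {
        if (firstVideoTsUs < 0) return; // drop audio until the first encoded video frame arrives
        if (!audioOffsetSet) {
            audioOffsetUs = firstVideoTsUs - info.presentationTimeUs; // align audio start to video start
            audioOffsetSet = true;
        }
        info.presentationTimeUs += audioOffsetUs;
        rtspclient.sendAudio(aacBuffer, info);
    }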

About the recording, I think you are missing adding the tracks to the recordController: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtplibrary/src/main/java/com/pedro/rtplibrary/base/Camera1Base.java#L915 https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtplibrary/src/main/java/com/pedro/rtplibrary/base/Camera1Base.java#L920 Remember that stopping the record is absolutely necessary to get a readable file.
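
A minimal sketch of wiring those tracks, mirroring the Camera1Base lines linked above; the onVideoFormat/onAudioFormat callback names and the RecordController setters are assumed from that class and may differ in other versions:

    public void onVideoFormat(MediaFormat mediaFormat) {
        recordController.setVideoFormat(mediaFormat); // registers the video track with the muxer
    }

    public void onAudioFormat(MediaFormat mediaFormat) {
        recordController.setAudioFormat(mediaFormat); // registers the audio track with the muxer
    }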

biviel commented 3 years ago

Just a note: it's not only a timing/sync issue, but also the sample rate... When I play it back, the audio is slowed down... Does that make any difference? Thanks!

pedroSG94 commented 3 years ago

It is really weird. I'm not sure about the reason. Did you try using another samplerate (48000 or 32000) or player?

biviel commented 3 years ago

Yes, I tried... I also used another device, a mobile phone, and over RTSP it worked fine with the same server and player. This may be device specific, but it also means it should be possible to make it work through the library. Can I get the actual samplerate that is really used somewhere, to compare with what is set? I will try to record it for you if I manage.

pedroSG94 commented 3 years ago

Do you have more than one AAC audio encoder on that device? You can check it using this:

    for (String codec: CodecUtil.showAllCodecsInfo()) {
      Log.e("Codec", codec);
    }

Show me the result and we can try using another codec to check whether it is a codec problem.
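
As a library-independent cross-check, the stock Android MediaCodecList API (API 21+, imports android.media.MediaCodecList, android.media.MediaCodecInfo and android.util.Log) can list the AAC encoders available on the device; a minimal sketch:

    MediaCodecList list = new MediaCodecList(MediaCodecList.ALL_CODECS);
    for (MediaCodecInfo codecInfo : list.getCodecInfos()) {
        if (!codecInfo.isEncoder()) continue;
        for (String type : codecInfo.getSupportedTypes()) {
            if (type.equalsIgnoreCase("audio/mp4a-latm")) { // AAC mime type
                Log.i("Codec", "AAC encoder: " + codecInfo.getName());
            }
        }
    }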

biviel commented 3 years ago

hi @pedroSG94, I tried 32000 for the samplerate and it worked! I will spend some more time testing to confirm... For RTMP, 48000, 22050, etc. worked, but for RTSP only 32000; strange. Is there a codec that can support spatial audio too? :)

I will come back to you, but with RTSP for some reason the camera doesn't heat up as much and can stream longer at higher quality. I will do some tests and do my best to share via YouTube, etc.; my platform is able to use HDR mode for live streaming from this cam, consuming the h265 encoded stream... I will try recording to local storage a bit later and let you know my findings.

Thanks for your help and support! Will still come back to you after I double check some of the issues discussed here.

pedroSG94 commented 3 years ago

HDR and spatial audio are not supported in this library. Also, the only protocol where it would be possible to develop that feature is RTSP.

biviel commented 3 years ago

Thanks! I will look into spatial audio once I get there; it would be a really nice feature for sure! Right now the only important piece here is that I'm failing to record video.

I use RTSP now and it streams, but for some reason the file created is empty...

I start the recording here:

   public void startStream(String url) {
        if (openGlView != null && Build.VERSION.SDK_INT >= 18) {
            openGlView.init();
            openGlView.setFps(20);
            openGlView.setEncoderSize(3840, 2160);
            openGlView.start();
            openGlView.addMediaCodecSurface(videoEncoder.getInputSurface());

        }
        startStreamRtp(url);
        videoEncoder.start();
        audioEncoder.start();
        microphoneManager.start();
        //videoEncoder.requestKeyframe();
        streaming = true;
        //uxer = new MediaMuxer(path, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
        File file = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM);
        try {
            recordController.startRecord(file.getPath() + "/temp.mp4", new RecordController.Listener() {
                @Override
                public void onStatusChange(RecordController.Status status) {
                    Timber.i("Record log - record state: " + status.name());
                    videoEncoder.requestKeyframe();
                }
            });
        } catch (IOException e) {
            e.printStackTrace();
        }
    }

When I start, I can see "Record log - record state: STARTED" in logcat, and when I stop I can see STOPPED, but there is still no data in the file; it's 0 bytes.

I stop it when the stream is stopped:

    public void stopStream() {
        microphoneManager.stop();
        this.stopStreamRtp();
        this.videoEncoder.stop();
        this.audioEncoder.stop();
        if (this.openGlView != null && Build.VERSION.SDK_INT >= 18) {
            this.openGlView.removeMediaCodecSurface();
            this.openGlView.stop();
        }
        recordController.stopRecord();
        this.streaming = false;
    }

Also, like you mentioned, I'm adding the audio and video tracks and I set them in onformatvideo / onformataudio. I record video and audio too:

    public void getVideoData(ByteBuffer h264Buffer, MediaCodec.BufferInfo info) {
        recordController.recordVideo(h264Buffer, info);
        if (streaming) getH264DataRtp(h264Buffer, info);
    }

    public void getAacData(ByteBuffer aacBuffer, MediaCodec.BufferInfo info) {
        if (Build.VERSION.SDK_INT >= 18 && this.recording && this.audioTrack != -1 && this.canRecord) {
            this.mediaMuxer.writeSampleData(this.audioTrack, aacBuffer, info);
        }
        this.getAacDataRtp(aacBuffer, info);
        recordController.recordAudio(aacBuffer, info);
    }

thanks again!

biviel commented 3 years ago

hi, I managed to make it record! I was missing something before.

A question: if I want to record only, without streaming over RTSP, I just have to omit/remove rtspclient.connect(url);, is that right?

Thanks!

P.S. Almost there...

biviel commented 3 years ago

@pedroSG94, when the encoder is set to H265 the saved video seems corrupt. Is the recorded video expected to be saved in HEVC format or not? The file is there when I look, but it doesn't seem to be playable.

Thanks!

pedroSG94 commented 3 years ago

A question, if I want to record only, without streaming in rtsp, I just have to omit/remove rtspclient.connect(url); , is that right?

Yes, that should work.

when the encoder is set to H265, saved video seems corrupt. Is it expected to save in HEVC format the recorded video or not?

If you read the RecordController class you can see that a keyframe is necessary to start recording the file: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtplibrary/src/main/java/com/pedro/rtplibrary/util/RecordController.java#L174 To know whether that line is called you have the record listener, where you should get the RECORDING status: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtplibrary/src/main/java/com/pedro/rtplibrary/util/RecordController.java#L164 Maybe you are not producing a keyframe after startRecord was called. I recommend you move RecordController.startRecord above VideoEncoder.start, or call requestKeyframe on the VideoEncoder to force it to produce a keyframe: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtplibrary/src/main/java/com/pedro/rtplibrary/base/Camera1Base.java#L608
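
A reordered sketch of the startStream posted earlier in this thread, following that advice; only the ordering changes, and all names are the ones already used in that snippet:

    // Start the record first so the muxer is already waiting when the next keyframe arrives.
    File file = Environment.getExternalStoragePublicDirectory(Environment.DIRECTORY_DCIM);
    try {
        recordController.startRecord(file.getPath() + "/temp.mp4", new RecordController.Listener() {
            @Override
            public void onStatusChange(RecordController.Status status) {
                Timber.i("Record log - record state: " + status.name());
            }
        });
    } catch (IOException e) {
        e.printStackTrace();
    }
    startStreamRtp(url);
    videoEncoder.start();
    audioEncoder.start();
    microphoneManager.start();
    videoEncoder.requestKeyframe(); // force a keyframe now that recording has started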

biviel commented 3 years ago

hi!

When I tried to save with h264 encoding the stream was recorded fine, but when I switched to h265 it seemed corrupt: it wasn't playable and the video info was missing/incorrect when I checked it in Windows.

1 - So this videoEncoder.requestKeyframe() should be executed after recording starts? I will check. But isn't it also required for h264?

2 - Another question, now about OpenGL filters: would it be possible to create a filter that transforms the dual fisheye format to equirectangular using hardware-accelerated GPU operations in OpenGL?

You can see the format here: https://blog.kuula.co/fisheye-equirectangular

The camera I'm using doesn't seem to handle the stitching well enough... it's too slow and that's causing the FPS to drop. Without stitching (the dual fisheye to equirectangular transformation) everything works fine. I'm wondering whether doing it via a filter in your library could be faster?

Thanks, Laszlo

pedroSG94 commented 3 years ago

1 - Yes, it is required in both cases, but maybe your H265 encoder only generates a keyframe when the VideoEncoder starts, and you call startRecord after that, so the recording never actually starts (the keyframe is skipped because you are not recording yet). For H264 your encoder should be generating keyframes every iFrameInterval seconds, as set in the prepareVideo method, and it seems this is not working properly for H265. You can verify this by checking the RecordController.Listener callback: if the RECORDING status is never reported, that is the case, and forcing a keyframe with requestKeyframe could help you. In my library I always call startRecord first and only after that start the encoders, or call requestKeyframe to generate a keyframe while recording: https://github.com/pedroSG94/rtmp-rtsp-stream-client-java/blob/master/rtplibrary/src/main/java/com/pedro/rtplibrary/base/Camera1Base.java#L367

2 - Yes, it is possible, but I can't confirm whether the performance will be better or not. It depends on your algorithm and your device's GPU performance. Also, you could consider reducing the resolution to improve the FPS.

pedroSG94 commented 9 months ago

Closing as inactive