Consti10 / LiveVideo10ms

Real-time video decoding on Android
GNU Lesser General Public License v3.0

Feature request: RTSP protocol #2

Closed. marclava closed this issue 4 years ago.

marclava commented 4 years ago

Playing an RTP stream works well, but I need to play RTSP streams.

Consti10 commented 4 years ago

Hi, you can play an RTSP stream with ffmpeg, but note that latency is generally higher when ffmpeg is used to receive the data. Example:
a) Go to settings -> VS_SOURCE -> VIA_FFMPEG_URL
b) Change VS_FFMPEG_URL to the proper URL

For example, I tested it with the following app as the server: https://play.google.com/store/apps/details?id=veg.mediacapture.sdk.test.server&hl=en and a URL like rtsp://192.168.1.1:8554/ch0

marclava commented 4 years ago

It works, but it's too slow and sometimes nothing shows; with RTP/UDP, it's instantaneous.

Consti10 commented 4 years ago

RTSP is quite complex; that's why I am using ffmpeg for RTSP. If you want to reduce latency, you have to use RTP over UDP.
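For reference, a minimal sketch of what a low-latency receive path can look like: read raw H.264 NAL units from a UDP socket and push them straight into a MediaCodec decoder. This is not the project's actual LowLagDecoder and it skips RTP depacketization; the port, resolution and the one-NAL-unit-per-datagram assumption are made up.

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.nio.ByteBuffer;

public class UdpH264Receiver {
    // Assumes each datagram carries exactly one NAL unit and that SPS/PPS arrive
    // in-band before the first frames (otherwise the decoder produces no output).
    public void run(Surface outputSurface) throws Exception {
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        decoder.configure(format, outputSurface, null, 0);
        decoder.start();

        DatagramSocket socket = new DatagramSocket(5600); // hypothetical port
        byte[] buffer = new byte[65507];                  // max UDP payload size
        MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
        while (!Thread.interrupted()) {
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            socket.receive(packet);

            int inIndex = decoder.dequeueInputBuffer(10_000);
            if (inIndex < 0) continue; // drop instead of blocking to keep latency low
            ByteBuffer in = decoder.getInputBuffer(inIndex);
            in.clear();
            in.put(packet.getData(), 0, packet.getLength());
            decoder.queueInputBuffer(inIndex, 0, packet.getLength(),
                    System.nanoTime() / 1000, 0);

            // Render output frames as soon as they become available
            // (a real implementation would drain the output on its own thread).
            int outIndex = decoder.dequeueOutputBuffer(info, 0);
            if (outIndex >= 0) decoder.releaseOutputBuffer(outIndex, true);
        }
        socket.close();
        decoder.stop();
        decoder.release();
    }
}
```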

marclava commented 4 years ago

Also, could it be that the type of H264 encoding matters? I was mostly successful at streaming H264 with a baseline profile.

Consti10 commented 4 years ago

Yes, it depends on both the encoder and the decoder. The example app provides you with detailed decoding information.
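A small sketch of how one can check which H.264 profiles the device's decoders advertise, e.g. to verify whether anything beyond Baseline is supported (the class name and log tag are made up):

```java
import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

public class AvcCapabilityCheck {
    // Logs every H.264 decoder on the device together with its supported profile/level pairs.
    public static void printAvcDecoderProfiles() {
        MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
        for (MediaCodecInfo info : list.getCodecInfos()) {
            if (info.isEncoder()) continue;
            for (String type : info.getSupportedTypes()) {
                if (!type.equalsIgnoreCase("video/avc")) continue;
                MediaCodecInfo.CodecCapabilities caps = info.getCapabilitiesForType(type);
                for (MediaCodecInfo.CodecProfileLevel pl : caps.profileLevels) {
                    // Compare pl.profile against constants such as
                    // MediaCodecInfo.CodecProfileLevel.AVCProfileBaseline.
                    Log.d("AvcCaps", info.getName() + " profile=" + pl.profile
                            + " level=" + pl.level);
                }
            }
        }
    }
}
```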

marclava commented 4 years ago

There's a LowLagDecoder class; could there also be a LowLagEncoder class, to stream the camera (or the screen) with RTP? That way, two Android devices could stream to one another at low latency.

Consti10 commented 4 years ago

I experimented a bit with using Android as a live stream camera here: https://github.com/Consti10/LiveVideoStreamProducer

Results were mixed. Some devices performed really well (low latency) while others were terrible.

marclava commented 4 years ago

I may not understand how to use it; do I have to specify the destination IP address? The IP address field is filled with my router address (192.168.1.1), so I changed it to the address of the phone with the player. When I start the video streaming activity, the screen stays white, and I can't see the stream in the player. In the phone log, it says that the camera was opened, but there's also this error:

07-08 17:36:36.064 E/Layer (619): [Surface(name=AppWindowToken{f8f9301 token=Token{5c790e8 ActivityRecord{de0ca0b u0 constantin.livevideostreamproducer/constantin.testlivevideostreamproducer.AVideoStream t5183}}})/@0xe43012c - animation-leash#0] No local sync point found

marclava commented 4 years ago

Maybe a better log; the camera stays on for a minute, then closes:

07-08 17:59:58.860  1676  4523 I ActivityTaskManager: START u0 {cmp=constantin.livevideostreamproducer/constantin.testlivevideostreamproducer.AVideoStream} from uid 10220
07-08 17:59:59.069   687  1883 I CameraService: CameraService::connect call (PID -1 "constantin.livevideostreamproducer", camera ID 0) for HAL version default and Camera API version 2
07-08 17:59:59.070   687  1883 I Camera2ClientBase: Camera 0: Opened. Client: constantin.livevideostreamproducer (PID 7561, UID 10220)
07-08 18:00:33.711   687  1883 I Camera2ClientBase: Closed Camera 0. Client was: constantin.livevideostreamproducer (PID 7561, UID 10220)

Consti10 commented 4 years ago

Hello, there was a bug in the live stream app that prevented the live stream from being generated. The latest commit should work. To make it work, both devices need to be on the same network (e.g. connected via hotspot) and you need to put the IP of the receiving device into the 'edit text' element. Sometimes the 'autofill' might do that for you. Then select udp and raw on the receiving phone.

This process is annoying, but I haven't had time yet to implement a simple protocol for reliably exchanging the IP. Contributions welcome.
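One possible shape for such a protocol (hypothetical, not part of the app): the receiving phone periodically broadcasts a small hello packet, and the streaming phone takes the source address of that packet as the destination IP. Port and message string are made up.

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.net.InetAddress;

public class ReceiverDiscovery {
    private static final int DISCOVERY_PORT = 5601; // hypothetical port
    private static final byte[] HELLO = "LIVE_VIDEO_RECEIVER".getBytes();

    // Run on the receiving phone: announce its presence once per second.
    public static void announce() throws Exception {
        DatagramSocket socket = new DatagramSocket();
        socket.setBroadcast(true);
        try {
            while (!Thread.interrupted()) {
                socket.send(new DatagramPacket(HELLO, HELLO.length,
                        InetAddress.getByName("255.255.255.255"), DISCOVERY_PORT));
                Thread.sleep(1000);
            }
        } finally {
            socket.close();
        }
    }

    // Run on the streaming phone: block until one announcement arrives
    // and return its source address as the stream destination.
    public static InetAddress discoverReceiver() throws Exception {
        DatagramSocket socket = new DatagramSocket(DISCOVERY_PORT);
        try {
            byte[] buffer = new byte[64];
            DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
            socket.receive(packet);
            return packet.getAddress();
        } finally {
            socket.close();
        }
    }
}
```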

Consti10 commented 4 years ago

I also don't know if it works with a router in between; I only tested with both phones connected directly via hotspot.

Consti10 commented 4 years ago

> Maybe a better log; the camera stays on for a minute, then closes:
> 07-08 17:59:58.860  1676  4523 I ActivityTaskManager: START u0 {cmp=constantin.livevideostreamproducer/constantin.testlivevideostreamproducer.AVideoStream} from uid 10220
> 07-08 17:59:59.069   687  1883 I CameraService: CameraService::connect call (PID -1 "constantin.livevideostreamproducer", camera ID 0) for HAL version default and Camera API version 2
> 07-08 17:59:59.070   687  1883 I Camera2ClientBase: Camera 0: Opened. Client: constantin.livevideostreamproducer (PID 7561, UID 10220)
> 07-08 18:00:33.711   687  1883 I Camera2ClientBase: Closed Camera 0. Client was: constantin.livevideostreamproducer (PID 7561, UID 10220)

What device do you use? I tested with a Pixel 3 and a ZTE Axon 7.

marclava commented 4 years ago

I use a Xiaomi Mi A2 Lite device with LineageOS 17.0 (Android 10). An even better log:

07-09 07:43:54.369  1676  1900 E LightsService: Light requested not available on this device. 2
07-09 07:43:54.458  1676  4900 I ActivityTaskManager: START u0 {cmp=constantin.livevideostreamproducer/constantin.testlivevideostreamproducer.AVideoStream} from uid 10220
07-09 07:43:54.491 16521 16521 W ActivityThread: handleWindowVisibility: no activity for token android.os.BinderProxy@512cdc9
07-09 07:43:54.554 16521 16563 I OMXClient: IOmx service obtained
07-09 07:43:54.555   708  2133 I OMXMaster: makeComponentInstance(OMX.qcom.video.encoder.avc) in android.hardwar process
07-09 07:43:54.560   708  2133 I OMX-VENC: Video encode perflock acquired,handle=106
07-09 07:43:54.671   708  2133 I OMX-VENC: Component_init : OMX.qcom.video.encoder.avc : return = 0x0
07-09 07:43:54.678   708  4745 E OMXNodeInstance: setParameter(0xf4fb0b04:qcom.encoder.avc, OMX.google.android.index.allocateNativeHandle(0x7f00005d): Input:0 en=0) ERROR: UnsupportedSetting(0x80001019)
07-09 07:43:54.678   708  2133 E OMXNodeInstance: setParameter(0xf4fb0b04:qcom.encoder.avc, OMX.google.android.index.allocateNativeHandle(0x7f00005d): Output:1 en=0) ERROR: UnsupportedSetting(0x80001019)
07-09 07:43:54.678   708  2133 W OMXNodeInstance: [0xf4fb0b04:qcom.encoder.avc] component does not support metadata mode; using fallback
07-09 07:43:54.679 16521 16563 W OMXUtils: do not know color format 0x7fa30c04 = 2141391876
07-09 07:43:54.680 16521 16563 W OMXUtils: do not know color format 0x7f000789 = 2130708361
07-09 07:43:54.700 16521 16563 I ACodec  : setupAVCEncoderParameters with [profile: Baseline] [level: Level31]
07-09 07:43:54.706 16521 16563 I ACodec  : [OMX.qcom.video.encoder.avc] cannot encode HDR static metadata. Ignoring.
07-09 07:43:54.706 16521 16563 I ACodec  : setupVideoEncoder succeeded
07-09 07:43:54.709 16521 16563 W OMXUtils: do not know color format 0x7f000789 = 2130708361
07-09 07:43:54.710   708  4914 E OMXNodeInstance: getConfig(0xf4fb0b04:qcom.encoder.avc, ConfigLatency(0x6f800005)) ERROR: UnsupportedIndex(0x8000101a)
07-09 07:43:54.718   708  2133 E OMXNodeInstance: getConfig(0xf4fb0b04:qcom.encoder.avc, ??(0x7f000062)) ERROR: UnsupportedSetting(0x80001019)
07-09 07:43:54.721   708   908 E OMXNodeInstance: getParameter(0xf4fb0b04:qcom.encoder.avc, ParamConsumerUsageBits(0x6f800004)) ERROR: UnsupportedIndex(0x8000101a)
07-09 07:43:54.724   708   908 D GraphicBufferSource: setting dataspace: 0x104, acquired=0
07-09 07:43:54.725   708  2133 E OMXNodeInstance: getParameter(0xf4fb0b04:qcom.encoder.avc, ParamConsumerUsageBits(0x6f800004)) ERROR: UnsupportedIndex(0x8000101a)
07-09 07:43:54.725   708  4914 D GraphicBufferSource: requesting color aspects (R:2(Limited), P:1(BT709_5), M:1(BT709_5), T:3(SMPTE170M))
07-09 07:43:54.781 16521 16521 D AVideoStream: Opening camera
07-09 07:43:54.782 16521 16521 I CameraManagerGlobal: Connecting to camera service
07-09 07:43:54.796 16521 16521 D AVideoStream: Available fps range(s):[[15, 15], [20, 20], [7, 24], [24, 24], [7, 30], [30, 30]]
07-09 07:43:54.801   687   687 I CameraService: CameraService::connect call (PID -1 "constantin.livevideostreamproducer", camera ID 0) for HAL version default and Camera API version 2
07-09 07:43:54.802   687   687 I Camera2ClientBase: Camera 0: Opened. Client: constantin.livevideostreamproducer (PID 16521, UID 10220)
07-09 07:43:54.802   687   687 I CameraDeviceClient: CameraDeviceClient 0: Opened
07-09 07:43:54.804   687   687 I CameraService: onTorchStatusChangedLocked: Torch status changed for cameraId=0, newStatus=0
07-09 07:43:54.805   567  1168 I QCamera : <HAL><INFO> cameraDeviceOpen: 416: Open camera id 0 API version 768
07-09 07:43:54.806   567  1168 D vndksupport: Loading /vendor/lib/hw/power.default.so from current namespace instead of sphal namespace.
07-09 07:43:54.807  1676  1676 V SettingsProvider: Notifying for 0: content://settings/secure/flashlight_available
07-09 07:43:54.807   564 13482 D audio_hw_primary: adev_set_parameters: enter: cameraFacing=back
07-09 07:43:54.808   564 13482 D msm8916_platform: platform_set_parameters: enter: - cameraFacing=back
07-09 07:43:54.808   564 13482 D msm8916_platform: platform_set_parameters:mic_ret: -2
07-09 07:43:54.808   564 13482 D msm8916_platform: platform_set_parameters:loopback_ret: 0
07-09 07:43:54.808   564 13482 D msm8916_platform: platform_set_parameters:rx force device: 0
07-09 07:43:54.808   564 13482 D audio_hw_extn: audio_extn_set_anc_parameters: anc_enabled:0
07-09 07:43:54.808   564 13482 D audio_hw_spkr_prot: audio_extn_fbsp_set_parameters: Speaker protection disabled
07-09 07:43:54.814   567  1168 I QCamera : <HAL><INFO> openCamera: 717: [KPI Perf]: E PROFILE_OPEN_CAMERA camera id 0
07-09 07:43:54.814   687   687 I CameraProviderManager: Camera device device@1.0/legacy/0 torch status is now NOT_AVAILABLE
07-09 07:43:54.814   687   687 I CameraService: onTorchStatusChangedLocked: Torch status changed for cameraId=0, newStatus=0
07-09 07:43:54.814   687   687 I CameraProviderManager: Camera device device@3.3/legacy/0 torch status is now NOT_AVAILABLE
07-09 07:43:54.814   687   687 I CameraService: onTorchStatusChangedLocked: Torch status changed for cameraId=0, newStatus=0
07-09 07:43:54.815   759   759 I mm-camera: < INFO> 395: enable_memleak_trace: start memleak tracking.
07-09 07:43:54.883   759   759 I mm-camera: <MCT   >< INFO> 63: mct_controller_new: Creating new mct_controller with session-id 1
07-09 07:43:54.883   759 16571 I mm-camera: <MCT   >< INFO> 4808: mct_pipeline_start_session_thread: E sensor
07-09 07:43:54.884   759 16571 I mm-camera: <MCT   >< INFO> 4815: mct_pipeline_start_session_thread: Calling start_session on Module sensor
07-09 07:43:54.884   759 16572 I mm-camera: <MCT   >< INFO> 4808: mct_pipeline_start_session_thread: E iface
07-09 07:43:54.884   759 16572 I mm-camera: <MCT   >< INFO> 4815: mct_pipeline_start_session_thread: Calling start_session on Module iface
07-09 07:43:54.885   759 16574 I mm-camera: <MCT   >< INFO> 4808: mct_pipeline_start_session_thread: E isp
07-09 07:43:54.885   759 16574 I mm-camera: <MCT   >< INFO> 4815: mct_pipeline_start_session_thread: Calling start_session on Module isp
07-09 07:43:54.885   759 16574 I mm-camera: <ISP   >< INFO> 205: isp_module_start_session: session id 1
07-09 07:43:54.885   759 16575 I mm-camera: <MCT   >< INFO> 4808: mct_pipeline_start_session_thread: E stats
07-09 07:43:54.885   759 16575 I mm-camera: <MCT   >< INFO> 4815: mct_pipeline_start_session_thread: Calling start_session on Module stats
07-09 07:43:54.886   759 16575 D DmbrContextAPI: VIDHANCE dmbr_create_context user=0xe7b9f780
07-09 07:43:54.886   759 16572 I mm-camera: <MCT   >< INFO> 4818: mct_pipeline_start_session_thread: Module iface start_session rc = 1
07-09 07:43:54.886   759 16572 I mm-camera: <MCT   >< INFO> 4826: mct_pipeline_start_session_thread: started_num = 1, success = 1
07-09 07:43:54.886   759 16572 I mm-camera: <MCT   >< INFO> 4833: mct_pipeline_start_session_thread: X iface
07-09 07:43:54.888   759 16575 E mm-camera: <STATS_AIS ><ERROR> 173: dsps_send_req: DSPS Send Request Timeout!!
07-09 07:43:54.888   759 16574 I mm-camera: <MCT   >< INFO> 4818: mct_pipeline_start_session_thread: Module isp start_session rc = 1
07-09 07:43:54.888   759 16574 I mm-camera: <MCT   >< INFO> 4826: mct_pipeline_start_session_thread: started_num = 2, success = 2
07-09 07:43:54.888   759 16574 I mm-camera: <MCT   >< INFO> 4833: mct_pipeline_start_session_thread: X isp
07-09 07:43:54.888   759 16575 I mm-camera: <MCT   >< INFO> 4818: mct_pipeline_start_session_thread: Module stats start_session rc = 1
07-09 07:43:54.888   759 16575 I mm-camera: <MCT   >< INFO> 4826: mct_pipeline_start_session_thread: started_num = 3, success = 3
07-09 07:43:54.888   759 16575 I mm-camera: <MCT   >< INFO> 4833: mct_pipeline_start_session_thread: X stats
07-09 07:43:54.889   759 16577 I mm-camera: <MCT   >< INFO> 4808: mct_pipeline_start_session_thread: E pproc
07-09 07:43:54.889   759 16577 I mm-camera: <MCT   >< INFO> 4815: mct_pipeline_start_session_thread: Calling start_session on Module pproc
07-09 07:43:54.889   759 16583 I mm-camera: <MCT   >< INFO> 4808: mct_pipeline_start_session_thread: E imglib
07-09 07:43:54.889   759 16583 I mm-camera: <MCT   >< INFO> 4815: mct_pipeline_start_session_thread: Calling start_session on Module imglib
07-09 07:43:54.890   759 16577 I mm-camera: <MCT   >< INFO> 4818: mct_pipeline_start_session_thread: Module pproc start_session rc = 1
07-09 07:43:54.890   759 16583 I mm-camera: <MCT   >< INFO> 4818: mct_pipeline_start_session_thread: Module imglib start_session rc = 1
07-09 07:43:54.890   759 16583 I mm-camera: <MCT   >< INFO> 4826: mct_pipeline_start_session_thread: started_num = 4, success = 4
07-09 07:43:54.890   759 16583 I mm-camera: <MCT   >< INFO> 4833: mct_pipeline_start_session_thread: X imglib
07-09 07:43:54.890   759 16577 I mm-camera: <MCT   >< INFO> 4826: mct_pipeline_start_session_thread: started_num = 5, success = 5
07-09 07:43:54.890   759 16577 I mm-camera: <MCT   >< INFO> 4833: mct_pipeline_start_session_thread: X pproc
07-09 07:43:54.893   759 16571 I mm-camera: <MCT   >< INFO> 4818: mct_pipeline_start_session_thread: Module sensor start_session rc = 1
07-09 07:43:54.893   759 16571 I mm-camera: <MCT   >< INFO> 4826: mct_pipeline_start_session_thread: started_num = 6, success = 6
07-09 07:43:54.894   759 16571 I mm-camera: <MCT   >< INFO> 4833: mct_pipeline_start_session_thread: X sensor
07-09 07:43:54.899   759   759 I mm-camera: <MCT   >< INFO> 4729: mct_pipeline_start_stream_internal: Adding session stream streamid= 0xf for session=1
07-09 07:43:54.899   759   759 I mm-camera: <MCT   >< INFO> 4777: mct_pipeline_start_stream_internal: Linking session stream for session 1
07-09 07:43:54.899   759   759 I mm-camera: <MCT   >< INFO> 510: mct_stream_start_link: Start linking Session-stream 0x1000f
07-09 07:43:54.899   759   759 I mm-camera: <ISP   >< INFO> 801: isp_port_check_caps_reserve: port 0xf1731380 ide 1000f type 10 dim 0 0
07-09 07:43:54.899   759   759 I mm-camera: <PPROC >< INFO> 446: pproc_port_add_modules_to_stream: in identity 1000f stream 10 int_link = 0xf1758b40
07-09 07:43:54.899   759   759 I mm-camera: <PPROC >< INFO> 458: pproc_port_add_modules_to_stream: :LINK linking mods tmod and ppeiscore for identity 1000f
07-09 07:43:54.900   759   759 I mm-camera: <PPROC >< INFO> 458: pproc_port_add_modules_to_stream: :LINK linking mods ppeiscore and c2d for identity 1000f
07-09 07:43:54.900   759   759 I mm-camera: <C2D   >< INFO> 1490: c2d_module_notify_add_stream: width 0, height 0, stride 0, scanline 0, is_type 0
07-09 07:43:54.900   759   759 I mm-camera: <PPROC >< INFO> 458: pproc_port_add_modules_to_stream: :LINK linking mods c2d and cpp for identity 1000f
07-09 07:43:54.900   759   759 I mm-camera: <CPP   >< INFO> 2155: cpp_module_notify_add_stream: :width 0, height 0, stride 0, scanline 0, framelen 0
07-09 07:43:54.900   759   759 I mm-camera: <CPP   >< INFO> 2320: cpp_module_notify_add_stream: : stream 10, fmt 1, asf_mode 0, sharpness_level 0.000000,asf mask 0, denoise 0, denoise_mask 0, dsdn mask 0,dsdn enable 0, tnr mask 0, tnr enable 0, ds_mask 0
07-09 07:43:54.900   759   759 I mm-camera: <PPROC >< INFO> 458: pproc_port_add_modules_to_stream: :LINK linking mods cpp and paaf for identity 1000f
07-09 07:43:54.900   759   759 I mm-camera: <PPROC >< INFO> 458: pproc_port_add_modules_to_stream: :LINK linking mods paaf and sw_tnr for identity 1000f
07-09 07:43:54.900   759   759 I mm-camera: <PPROC >< INFO> 458: pproc_port_add_modules_to_stream: :LINK linking mods sw_tnr and llvd for identity 1000f
07-09 07:43:54.901   759   759 I mm-camera: <PPROC >< INFO> 458: pproc_port_add_modules_to_stream: :LINK linking mods llvd and ezt for identity 1000f
07-09 07:43:54.901   759   759 I mm-camera: <PPROC >< INFO> 458: pproc_port_add_modules_to_stream: :LINK linking mods ezt and quadracfa for identity 1000f
07-09 07:43:54.901   759   759 E mm-camera: <STATS ><ERROR> 2834: stats_port_check_caps_reserve: Invalid Port capability type!
07-09 07:43:54.901   759   759 I chatty  : uid=1006(camera) mm-qcamera-daem identical 3 lines
07-09 07:43:54.901   759   759 E mm-camera: <STATS ><ERROR> 2834: stats_port_check_caps_reserve: Invalid Port capability type!
07-09 07:43:54.902   759   759 I mm-camera: <MCT   >< INFO> 4786: mct_pipeline_start_stream_internal: Session stream linked successfully session 1
07-09 07:43:54.904   575   600 I SDM     : ResourceImpl::SetMaxBandwidthMode: new bandwidth mode=1
07-09 07:43:54.906   567  1168 I QCamera : <HAL><INFO> openCamera: 727: [KPI Perf]: X PROFILE_OPEN_CAMERA camera id 0, rc: 0
07-09 07:43:54.906   567  1168 I QCamera : <HAL><INFO> initialize: 949: E :mCameraId = 0 mState = 1
07-09 07:43:54.907   567  1168 I QCamera : <HAL><INFO> initialize: 982: X
07-09 07:43:54.907   567  1168 E /vendor/bin/hw/android.hardware.camera.provider@2.4-service: Failed to get IAshmemDeviceService.
07-09 07:43:54.907   567  1168 E /vendor/bin/hw/android.hardware.camera.provider@2.4-service: Failed to get IAshmemDeviceService.
07-09 07:43:54.909   424   424 I hwservicemanager: getTransport: Cannot find entry vendor.lineage.camera.motor@1.0::ICameraMotor/default in either framework or device manifest.
07-09 07:43:54.963 16521 16521 D AVideoStream: CameraDevice onOpened

Consti10 commented 4 years ago

Yes, it must be the bug mentioned above. Try the latest commit from master in the video stream producer.

marclava commented 4 years ago

Oops, you're right; I forgot to git pull...

So yes, now it works, even using a router, and the latency is small, but there are artifacts with fast motion (when the scene is changing a lot). When using a hotspot connection, it's better (the latency seems smaller, and there are no artifacts).

The receiver must be started first (using the raw protocol), then the streamer; with an RTP stream (using gstreamer), the receiver can be started after streaming has begun. If the screen of the receiver is turned off, the stream doesn't show anymore when the screen is turned back on, so the whole process must be restarted.

I lowered a few streaming settings (FPS and bitrate), and it marginally improved the results.

Very impressive work! :-)

Consti10 commented 4 years ago

Artifacts are caused by WiFi packet loss (UDP). You can play with FEC to make the stream more reliable.
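For illustration only, the simplest FEC idea (not the scheme used in this project): group K data packets and send one extra parity packet that is their XOR; the receiver can then reconstruct any single lost packet of the group. Real schemes (e.g. Reed-Solomon based) tolerate more losses per group.

```java
public class XorFec {
    // Build the parity packet for a group of equally sized data packets.
    public static byte[] buildParity(byte[][] group) {
        byte[] parity = new byte[group[0].length];
        for (byte[] packet : group) {
            for (int i = 0; i < parity.length; i++) {
                parity[i] ^= packet[i];
            }
        }
        return parity;
    }

    // Reconstruct the single missing packet by XOR-ing the parity with all received packets.
    public static byte[] recoverMissing(byte[][] receivedPackets, byte[] parity) {
        byte[] missing = parity.clone();
        for (byte[] packet : receivedPackets) {
            for (int i = 0; i < missing.length; i++) {
                missing[i] ^= packet[i];
            }
        }
        return missing;
    }
}
```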

To start the receiver after the server (or to restart the receiver), the codec configuration data must be sent at intervals instead of only once at the beginning of the stream. See csd-0 / csd-1 in the decoder code.
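For context, csd-0 / csd-1 are the SPS and PPS NAL units the decoder needs before it can be configured; a receiver that joins the stream late never sees them if they were only sent once. A minimal sketch of the decoder-side configuration (resolution and helper class are made up):

```java
import android.media.MediaCodec;
import android.media.MediaFormat;
import android.view.Surface;
import java.io.IOException;
import java.nio.ByteBuffer;

public class DecoderConfigHelper {
    // Configures an H.264 decoder from SPS/PPS NAL units extracted from the stream.
    public static MediaCodec configureH264Decoder(byte[] sps, byte[] pps, Surface surface)
            throws IOException {
        MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
        format.setByteBuffer("csd-0", ByteBuffer.wrap(sps)); // SPS
        format.setByteBuffer("csd-1", ByteBuffer.wrap(pps)); // PPS
        MediaCodec decoder = MediaCodec.createDecoderByType("video/avc");
        decoder.configure(format, surface, null, 0);
        decoder.start();
        return decoder;
    }
}
```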

Consti10 commented 4 years ago

Actually, it looks like my Pixel 3 supports the KEY_PREPEND_HEADER_TO_SYNC_FRAMES flag and therefore allows restarting the stream without problems. I'll add a fix for devices without this flag.

marclava commented 4 years ago

The flag was added in Android 10: KEY_PREPEND_HEADER_TO_SYNC_FRAMES
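A sketch of how the encoder side can request in-band headers on Android 10+ (the helper class is made up; on older devices the config data has to be repeated some other way, like the workaround mentioned below):

```java
import android.media.MediaFormat;
import android.os.Build;

public class EncoderFormatHelper {
    // Ask the encoder to prepend SPS/PPS to every sync (IDR) frame so that a
    // receiver joining mid-stream can still configure its decoder.
    public static void requestInBandHeaders(MediaFormat encoderFormat) {
        if (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) { // Android 10
            encoderFormat.setInteger(MediaFormat.KEY_PREPEND_HEADER_TO_SYNC_FRAMES, 1);
        }
        // Pre-Android-10 devices: cache csd-0 / csd-1 from the encoder output
        // and re-send them periodically instead.
    }
}
```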

Consti10 commented 4 years ago

Added a workaround for that to the video producer app.

marclava commented 4 years ago

It works, thanks.

Now I'll have to play with FEC... All I found is: https://github.com/wangyu-/UDPspeeder

Edit: I just tried speederv2 on my phone; I had to compile it in Termux (because the released Android binary was crashing). On the receiving side, there were tons of corrupted macroblocks; for some reason it doesn't work, so I won't use it.

Consti10 commented 4 years ago

I did some testing with FEC. The short story is: FEC doesn't work over UDP and normal WiFi. If you want the long story, join our Telegram.