dji-sdk / Mobile-SDK-Android

DJI Mobile SDK for Android: http://developer.dji.com/mobile-sdk/

VideoFeeder data callback not called on Mavic 2 / Android SDK 4.7.1 #352

Open straule opened 5 years ago

straule commented 5 years ago

Android SDK 4.7.1 does not call the video feeder's data callback when used with a Mavic 2 Pro.

After the connection procedure, the callback is set as described in the documentation and sample code:

```java
VideoFeeder.getInstance().getPrimaryVideoFeed().setCallback(new VideoFeeder.VideoDataCallback() {
    @Override
    public void onReceive(byte[] bytes, int i) {
        // Never gets called
    }
});
```

Video decoding is still possible using DJICodecManager, which appears to render the data directly to the surface, but the data is never passed to the callback above, so the app cannot do its own processing. If the same code is run with another drone (e.g. Mavic Pro), onReceive() is called as expected. The behavior can be reproduced with the SDK Sample project, where the callbacks are also never called. Device is a Sony Xperia XZ Premium.

Michael-DJI commented 5 years ago

@straule thanks for your feedback. Yes, you are right: for the Mavic 2 the SDK renders the data directly to the surface.

straule commented 5 years ago

Can this be corrected in the next version of the Android SDK? As it stands, this behaviour prevents the app from doing its own processing on the raw video stream. For example, in our app we want to share the raw video stream over the network to another device; this no longer works with the Mavic 2 because of this issue.

Michael-DJI commented 5 years ago

@straule you could use the provideTranscodedVideoFeed() method in the VideoFeeder class to get a new video feed; it delivers standard H.264.
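For anyone following along, registering against the transcoded feed would look roughly like this. This is only a sketch against the MSDK 4.9-era API (addVideoDataListener/VideoDataListener superseded the older setCallback/VideoDataCallback); it requires the DJI runtime and a connected aircraft, and as later comments in this thread note, the listener may only fire while a decoder is actually consuming the feed:

```java
import dji.sdk.camera.VideoFeeder;

class TranscodedFeedExample {
    // Sketch: attach a listener to the transcoded (standard H.264) video feed.
    void listenToTranscodedFeed() {
        VideoFeeder.VideoFeed transcodedFeed =
                VideoFeeder.getInstance().provideTranscodedVideoFeed();
        transcodedFeed.addVideoDataListener(new VideoFeeder.VideoDataListener() {
            @Override
            public void onReceive(byte[] videoBuffer, int size) {
                // Standard H.264 data: safe to forward over the network
                // or feed into your own decoder.
            }
        });
    }
}
```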

Consti10 commented 5 years ago

The same is true for the DJI Spark (when connected via WiFi). More details here: https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample/issues/43

ghost commented 5 years ago

Did anyone manage to make it work with provideTranscodedVideoFeed()? The callback is still never invoked for us, and we cannot register a surface from our backend. The sample app does not seem to work with the Mavic 2 Pro either.

Consti10 commented 5 years ago

Hello. Since no DJI developer really answered our questions, I will share my impressions here. I have an already-working FPV VR app for non-DJI drones. Last month I bought a DJI Spark to start adding support for DJI aircraft; I didn't expect it to be so buggy and hard. As we have already discovered, there is no bug-free way to obtain live video from DJI drones: callbacks are not called, documentation is missing, etc. However, I have the impression their dev team is currently reworking the video-receiving component. Why? With the most recent updates to the Mobile SDK there were quite a lot of changes regarding video callbacks and this functionality in general, and the documentation was updated as well.

Here are some key points that might help others:

Michael-DJI commented 5 years ago

@Consti10 Sorry for the inconvenience. As for your questions:

> Can this be corrected in the next version of the Android SDK? As it stands, this behaviour prevents the app from doing its own processing on the raw video stream. For example, in our app we want to share the raw video stream over the network to another device; this no longer works with the Mavic 2 because of this issue.

Actually, it's not a bug in the Mavic 2. The reason we can't directly give you an interface to the raw video stream is that the raw stream from the Mavic 2 cannot be decoded outside the MSDK: we added logic to request key frames while decoding it internally, and the video also needs to be distortion-corrected during decoding. So even if you got the raw video, it would be useless. That is why we added a new interface, provideTranscodedVideoFeed(), which provides a standard H.264 stream whose data you can transfer over the network. Besides, please refer to this demo about getting the live video: https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample

straule commented 5 years ago

Thanks for the feedback, Michael. Let me comment on your points. First, there should be an official SDK function to request a key frame from the Mavic. This should be available anyway in case video decoding needs to be restarted on the app side, and it would be much more reliable than manually injecting key frames. Second, distortion correction is something we already do on the app side in our FPV app, to adapt to the various FPV headsets, which also introduce distortion through their lenses; from a processing-power and latency perspective it is far more effective to do this once than twice. Last, all of this is already possible with the iOS SDK, which has a working callback and also provides a function to request a key frame, so I do not understand why it is not possible in the Android SDK.

Michael-DJI commented 5 years ago

@straule I got your points, and thanks for the very good advice! Yes, as you know, the decoding logic differs between the Android and iOS sides. Right now the Android side does not provide sufficient interfaces because we were trying to keep the interface as simple as possible and avoid complexity, but this led to less flexibility. We will consider adding more interfaces and supporting more flexibility; please wait for the changes in a later version. Thanks again!

oscarmore2 commented 5 years ago

I tried to use provideTranscodedVideoFeed() to get a VideoFeeder instance and registered the callback on that instance, but onReceive() is never called. Please see my issue here: https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample/issues/45. @Michael-DJI

ghost commented 5 years ago

I managed to trigger the listener by adding this to my code:

```kotlin
mCodecManager = DJICodecManager(context, SurfaceTexture(0), 100, 100)
```

I am not sure that using the codec manager should be mandatory to trigger a listener that has no binding to this object.

ricklentz commented 5 years ago

> @Consti10 Sorry for the inconvenience. As for your questions:
>
> > Can this be corrected in the next version of the Android SDK? As it stands, this behaviour prevents the app from doing its own processing on the raw video stream. For example, in our app we want to share the raw video stream over the network to another device; this no longer works with the Mavic 2 because of this issue.
>
> Actually, it's not a bug in the Mavic 2. The reason we can't directly give you an interface to the raw video stream is that the raw stream from the Mavic 2 cannot be decoded outside the MSDK: we added logic to request key frames while decoding it internally, and the video also needs to be distortion-corrected during decoding. So even if you got the raw video, it would be useless. That is why we added a new interface, provideTranscodedVideoFeed(), which provides a standard H.264 stream whose data you can transfer over the network. Besides, please refer to this demo about getting the live video: https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample

Hello @Michael-DJI, can you please provide an update regarding the VideoFeeder callback? It seems @raphael-pix4d found a workaround for the noted callback issue with provideTranscodedVideoFeed(), but, more to @straule's point about the API, it would be great to have the existing API callbacks working. Your thoughts?

hblanken commented 5 years ago
  1. Which drone products need lens antidistortion applied? All or only Mavic2 zoom?
  2. Why is "VideoFeeder.VideoDataListener.OnReceive" callback not called when using "provideTranscodedVideoFeed" on Mavic 2?
Michael-DJI commented 5 years ago

@hblanken 1. There are three drones which need anti-distortion: Mavic 2 Pro, Mavic 2 Zoom, and Mavic 2 Enterprise Zoom. 2. Because the onReceive() method of the transcoded video feed is only called while a surface or texture is displaying the live video. The technical reason is that in the background we first decode the live stream with a decoder (which also handles anti-distortion and I-frame requesting) and then encode it again with an encoder; the data delivered by onReceive() on the transcoded feed actually comes from that encoder, so you need to make sure the first decoder is working properly.

Michael-DJI commented 5 years ago

We are actually refining this part of the logic; thanks for all the feedback! BTW, in 4.9, as you know (if you have read the release notes), we have provided a LiveStreamManager class which can help stream the live view to an RTMP server. In the sample of this repo you can find how to use it and how to change the video source in LiveStreamManager: https://github.com/dji-sdk/Mobile-SDK-Android/blob/master/Sample%20Code/app/src/main/java/com/dji/sdk/sample/demo/camera/LiveStreamView.java This might solve some of your problems. Thanks again!

mordka commented 5 years ago

Hi @Michael-DJI, please keep us updated on the new API that will allow consumption of raw H.264 streams. I tried LiveStreamView, but I am getting error number -3 from the following call:

```java
int result = DJISDKManager.getInstance().getLiveStreamManager().startStream();
```

What is the meaning of error -3? Could you post descriptions for all the error codes?

Michael-DJI commented 5 years ago

@mordka which drone were you using? Please try to call setVideoEncodingEnabled() before startStream().
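Putting Michael's hint together with the LiveStreamManager usage mentioned earlier, the sequence would look roughly like this. This is a sketch only: it needs the DJI runtime, the RTMP URL is a placeholder, and setLiveUrl() is assumed to be the endpoint-configuration method (as used in the linked LiveStreamView sample):

```java
import dji.sdk.sdkmanager.DJISDKManager;
import dji.sdk.sdkmanager.LiveStreamManager;

class RtmpStreamExample {
    // Sketch: configure and start an RTMP live stream (DJI MSDK 4.9+).
    void startRtmpStream() {
        LiveStreamManager manager = DJISDKManager.getInstance().getLiveStreamManager();
        manager.setLiveUrl("rtmp://example.com/live/stream"); // placeholder endpoint
        manager.setVideoEncodingEnabled(true); // per Michael's hint: enable before starting
        int result = manager.startStream();
        // 0 means success; -3 means SPS/PPS info could not be obtained
        // (per Michael's reply later in this thread).
    }
}
```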

mordka commented 5 years ago

Hi @Michael-DJI, I'm testing on both the Mavic Pro and the Mavic 2 Enterprise Dual. I can stream on the Mavic Pro if I remove the VideoFeeder logic entirely. I would still like to know the error code descriptions to understand this better.

Michael-DJI commented 5 years ago

@mordka -3 means the SDK can't get the SPS/PPS info.

sharvashish commented 5 years ago

Hello guys,

Any update in this regard? Do we have to use provideTranscodedVideoFeed() for the Mavic 2 series? Will

```java
VideoFeeder.getInstance().getPrimaryVideoFeed().addVideoDataListener(mReceivedVideoDataListener);
```

not work?

Also, if I have DJI-UXSDK-4.9 (instead of DJI-SDK 4.9) included in my build.gradle, the transcoded video feed will still work, right?

Consti10 commented 5 years ago

Unfortunately, my issues with the DJI Spark (#369) still persist: DJICodecManager is mandatory for getting live video data callbacks. Initializing a fake DJICodecManager:

```java
codecManager = new DJICodecManager(this.getContext(), null, 0, 0,
        isPrimaryVideoFeed ? UsbAccessoryService.VideoStreamSource.Camera
                           : UsbAccessoryService.VideoStreamSource.Fpv);
```

makes the callback active, but results in crashes when initializing a real decoder.

Calling

```java
int result = DJISDKManager.getInstance().getLiveStreamManager().startStream();
```

doesn't change anything, and when looking at the source code its implementation is as follows:

```java
public int startStream() { return 0; }
```

WTF?!

oscarmore2 commented 5 years ago

I updated the SDK to 4.10, but the listener of the provideTranscodedVideoFeed() feed is still never called. Reference: https://github.com/DJI-Mobile-SDK-Tutorials/Android-VideoStreamDecodingSample/issues/45#issuecomment-527776476

keivanh commented 4 years ago

I have the same problem with 4.11.1. It is strange: everything in the "Sample code" works fine, but without VideoFeeder.VideoDataListener.onReceive being called!

Actually, I removed the listener entirely and I still get frames on the canvas! It seems it is enough to just instantiate a DJICodecManager object in your code and it gets the data magically. This is really bad design; I spent 2 days figuring this out.

To get frames I set:

```java
codecManager.enabledYuvData(true);

codecManager.setYuvDataCallback(new DJICodecManager.YuvDataCallback() {
    @Override
    public void onYuvDataReceived(MediaFormat mediaFormat, ByteBuffer byteBuffer, int i, int i1, int i2) {
        Log.d("DJIFRAME", "Got new Yuv frame " + i + "  " + i1 + "   " + i2);
    }
});
```
jeryini commented 4 years ago

Hello!

Came to the exact same conclusion as @keivanh. Setting a callback via setYuvDataCallback actually works, but I'm having some issues processing the received byteBuffer. @keivanh, how are you processing it in onYuvDataReceived? Specifically, I want to access the byte array, i.e. yuvFrame.array(), but instead I see lots of messages in the logs from CCodecConfig.

keivanh commented 4 years ago

> Hello!
>
> Came to the exact same conclusion as @keivanh. Setting a callback via setYuvDataCallback actually works, but I'm having some issues processing the received byteBuffer. @keivanh, how are you processing it in onYuvDataReceived? Specifically, I want to access the byte array, i.e. yuvFrame.array(), but instead I see lots of messages in the logs from CCodecConfig.

For example:

```java
public void onYuvDataReceived(MediaFormat mediaFormat, ByteBuffer frame, int i, int i1, int i2) {
    byte[] dataBuffer = new byte[frame.remaining()];
    frame.get(dataBuffer, 0, i); // i is the frame length in bytes
}
```

I am doing this because I need a direct buffer to send down to an NDK module, but I think if you work with the ByteBuffer directly it gives you better performance.

jeryini commented 4 years ago

Hey! Thank you for this. I just wanted to write that I figured out that you indeed need to call yuvFrame.get(data); to get the byte array from the frame.
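The ByteBuffer handling discussed in these comments can be collected into a small plain-Java helper (no DJI dependencies; the class and method names here are mine). The key detail is that frames from a codec are usually direct buffers, for which array() throws, while bulk get() works for both direct and heap buffers:

```java
import java.nio.ByteBuffer;

public class FrameCopy {
    // Copy the remaining bytes of a ByteBuffer into a plain byte[].
    // Works for direct buffers too, where buffer.array() would throw
    // an UnsupportedOperationException.
    public static byte[] copyFrame(ByteBuffer frame) {
        byte[] data = new byte[frame.remaining()];
        frame.get(data); // advances the buffer's position to its limit
        return data;
    }
}
```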

manuoso commented 4 years ago

> I have the same problem with 4.11.1. It is strange: everything in the "Sample code" works fine, but without VideoFeeder.VideoDataListener.onReceive being called!
>
> Actually, I removed the listener entirely and I still get frames on the canvas! It seems it is enough to just instantiate a DJICodecManager object in your code and it gets the data magically. This is really bad design; I spent 2 days figuring this out.
>
> To get frames I set:
>
> ```java
> codecManager.enabledYuvData(true);
>
> codecManager.setYuvDataCallback(new DJICodecManager.YuvDataCallback() {
>     @Override
>     public void onYuvDataReceived(MediaFormat mediaFormat, ByteBuffer byteBuffer, int i, int i1, int i2) {
>         Log.d("DJIFRAME", "Got new Yuv frame " + i + "  " + i1 + "   " + i2);
>     }
> });
> ```

Hi! How did you do that? I mean, how do you initialize a DJICodecManager with a canvas? I want to initialize DJICodecManager without any TextureView or SurfaceView.

keivanh commented 4 years ago

In simplest form this would be enough to get YUV frame: https://gist.github.com/keivanh/2d4309f5fa88dc520264c21cd10bbb43

manuoso commented 4 years ago

@keivanh Oh, that's good. Thank you!!!!

jeryini commented 4 years ago

Hello @keivanh!

As mentioned above, I'm using this technique to get the YUV data frame. But for some unknown reason I'm getting a gray image with green/violet artifacts. It has something to do with codecs, but I haven't been able to come up with a solution. Do you have any ideas? See example: example

keivanh commented 4 years ago

Check issue #499; I have explained a partially similar problem there. Inside the MediaFormat you get a ColorSpace code, which can differ per device/Android version. So far I have seen 19 and 21, referring to YUV_I420 and YUV_NV12 respectively. See #422.

You have to switch to a different decoder based on the value of the ColorSpace. I am using OpenCV NDK code to decode, but that is extreme overkill for most use cases. There are several Java decoders that do the same thing.
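The two color formats mentioned above differ only in chroma-plane layout, so in many cases the "different decoder" can be as simple as de-interleaving a semi-planar NV12 frame (format 21) into planar I420 (format 19) before handing it to an I420-only consumer; frames that already arrive as 19 need no conversion. A minimal plain-Java sketch (my own illustrative helper, not a DJI API):

```java
public class YuvConvert {
    // NV12: Y plane (w*h bytes) followed by interleaved U/V samples (w*h/2 bytes).
    // I420: Y plane, then all U samples (w*h/4), then all V samples (w*h/4).
    public static byte[] nv12ToI420(byte[] nv12, int width, int height) {
        int ySize = width * height;
        int uvCount = ySize / 4;                        // number of U (and V) samples
        byte[] i420 = new byte[ySize + 2 * uvCount];
        System.arraycopy(nv12, 0, i420, 0, ySize);      // Y plane is identical
        for (int n = 0; n < uvCount; n++) {
            i420[ySize + n] = nv12[ySize + 2 * n];               // de-interleave U
            i420[ySize + uvCount + n] = nv12[ySize + 2 * n + 1]; // de-interleave V
        }
        return i420;
    }
}
```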

jeryini commented 4 years ago

Interesting, didn't know about this issue. Many thanks for sharing this info, will definitely take a look at it.

Andreas1331 commented 1 year ago

> I have the same problem with 4.11.1. It is strange: everything in the "Sample code" works fine, but without VideoFeeder.VideoDataListener.onReceive being called!
>
> Actually, I removed the listener entirely and I still get frames on the canvas! It seems it is enough to just instantiate a DJICodecManager object in your code and it gets the data magically. This is really bad design; I spent 2 days figuring this out.
>
> To get frames I set:
>
> ```java
> codecManager.enabledYuvData(true);
>
> codecManager.setYuvDataCallback(new DJICodecManager.YuvDataCallback() {
>     @Override
>     public void onYuvDataReceived(MediaFormat mediaFormat, ByteBuffer byteBuffer, int i, int i1, int i2) {
>         Log.d("DJIFRAME", "Got new Yuv frame " + i + "  " + i1 + "   " + i2);
>     }
> });
> ```

Which drone are you using? I'm on a Mavic 2 Enterprise, but this solution does not trigger the YuvDataCallback() ... EDIT: the event is fired when using Usb as the last parameter, but when trying to use VideoSource from DJICodecManager it stops working.