awslabs / amazon-kinesis-video-streams-producer-sdk-java

Allows developers to install and customize their connected camera and other devices to securely stream video, audio, and time-encoded data to Kinesis Video Streams
Apache License 2.0

Example for pushing live camera frames directly to kinesis video stream #10

Closed backdoorcodr closed 6 years ago

backdoorcodr commented 6 years ago

From what I have understood after looking into the code and running it (so far in Java only; I am also trying the C++ SDK but am facing issues that have already been reported), the Java sample application reads stored frames from the local drive and pushes them to the Kinesis stream. What I am more interested in, however, is a producer application that streams directly from the camera and pushes the frames to Kinesis. Later on I will be using these streams to perform facial recognition.

For this purpose I have started writing a small class (CameraMediaSource) myself, using ImageFileMediaSource as an example. However, I would like to know how to create Kinesis video frames from a camera source. Is there any example available other than the createKinesisVideoFrameFromImage() method? Should I be using JavaCV for this purpose, or is there anything provided/supported in the current SDK?

Many thanks

backdoorcodr commented 6 years ago

Hello again,

I am trying to do the camera configuration, however I am getting a null exception at this stage.

I have created a createCameraMediaSource() method. This is how I am trying to do it, and it looks like this at the moment:

final CameraMediaSourceConfiguration configuration =
                new CameraMediaSourceConfiguration.Builder()
                .withFrameRate(30)
                .withCameraFacing(1)
                .withCameraId("logitech")
                .withIsEncoderHardwareAccelerated(false)
                .withRetentionPeriodInHours(2)
                .build();

What else is required here? I am not sure what is correct and what is not, so I am looking forward to learning the correct way of doing it.

At a later stage, I understand that a CameraMediaSource needs to be available as well; hence these are the next statements of my createCameraMediaSource() method:

        final CameraMediaSource mediaSource = new CameraMediaSource();
        mediaSource.configure(configuration);

and at the end, the method returns the mediaSource.

Looking forward to your suggestions / hints / guidance / corrections.

MushMal commented 6 years ago

@backdoorcodr, your understanding is correct - the demo application simply streams a set of previously captured frames to demonstrate the streaming capability of the SDK. You are also correct that for your purpose you will need to implement a CameraMediaSource (for video). We are working on a good way of providing generic camera support/documentation for Java - the main issue here is that Java itself doesn't have generic camera API support across platforms/OSes.

One thing I would recommend is to take a look at the Android implementation - at the very least it will give you some insight into an implementation that creates a source and an encoder, from which it gets the encoded frames and the CPD (codec private data).

Android SDK is located here: https://github.com/aws/aws-sdk-android/tree/master/aws-android-sdk-kinesisvideo

backdoorcodr commented 6 years ago

@MushMal Many thanks for your prompt response.

A generic documentation for cameras would be really helpful. While working on this, I also realized that Java does not have generic camera API support. For now, however, I am trying to use the webcam-capture API, which can stream from the camera feed via its Webcam class, a wrapper around the webcam device obtained from the webcam driver.

What I am trying to do with this API is the same, with one small change: once a frame is fetched and the frame data is available, a Kinesis video frame should be created out of that data. I am working on that part in the CameraFrameSource class, where the webcam instance keeps returning a byte buffer as soon as frame data is available.
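
For reference, below is a rough sketch of the capture loop described above, using the sarxos webcam-capture library (com.github.sarxos:webcam-capture). The frameConsumer callback is a hypothetical stand-in for the hand-off into CameraFrameSource, and the frames at this point are still raw images (serialized here as JPEG bytes), not yet H264:

import com.github.sarxos.webcam.Webcam;

import javax.imageio.ImageIO;
import java.awt.Dimension;
import java.awt.image.BufferedImage;
import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.util.function.Consumer;

public class WebcamFrameGrabber {

    public void startCapturing(final Consumer<ByteBuffer> frameConsumer) throws Exception {
        final Webcam webcam = Webcam.getDefault();        // first camera reported by the driver
        webcam.setViewSize(new Dimension(640, 480));
        webcam.open();
        try {
            while (!Thread.currentThread().isInterrupted()) {
                final BufferedImage image = webcam.getImage();   // one raw frame
                if (image == null) {
                    continue;
                }
                // Serialize the frame and hand the bytes over; a real pipeline would
                // encode to H264 here instead of writing JPEG bytes.
                final ByteArrayOutputStream baos = new ByteArrayOutputStream();
                ImageIO.write(image, "jpg", baos);
                frameConsumer.accept(ByteBuffer.wrap(baos.toByteArray()));
            }
        } finally {
            webcam.close();
        }
    }
}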

backdoorcodr commented 6 years ago

Hello @MushMal ,

I am making attempts to stream to Kinesis and have just opened one of these attempts as PR #11, so that I can give you a better picture of how I am trying to do it at the moment. However, Kinesis still does not seem to receive streams from the producer. Could you take a look at the following statements from the debugging console and suggest whether my camera configuration might be incorrect?

DEBUG / KinesisVideo: Received ACK bits: 0
DEBUG / KinesisVideo: Streamed 13425 bytes for stream my-stream
WARN / KinesisVideo: Stream my-stream has been closed
DEBUG / KinesisVideo: Streamed -1 bytes for stream my-stream

In addition to that:

No video fragments were found. Verify that video is currently being streamed and that the fragment timecodes are correct.

backdoorcodr commented 6 years ago

Hello again,

Any update on this?

I also tried with the latest version and the same error occurs. The CameraFrameSource and CameraMediaSource classes are available in my PR. You can also review my createCameraMediaSource() method to check my camera configuration.

Would it be possible to provide a demo example for pushing video streams directly from a webcam? If not, what steps need to be followed? Are there any tutorials we can follow? Is my camera configuration in createCameraMediaSource() correct?

zhiyua-git commented 6 years ago

Thanks for trying to send webcam streams through Kinesis Video Streams. Depending on the format in which your camera is capturing the video, you have to set the FLAGs in the CameraMediaSource.
Could you try final int flags = FRAME_FLAG_KEY_FRAME; at line 88 of CameraMediaSource.java? Let us know how it goes.
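
For context, that flag ends up on the frame handed to the media source sink. Below is a minimal sketch of that hand-off; the KinesisVideoFrame constructor argument order, the FrameFlags constant, and the surrounding names (frameIndex, FRAME_DURATION_HNS, encodedFrameBytes, mediaSourceSink) are assumptions to be checked against the media source implementations in your SDK version:

final int flags = FrameFlags.FRAME_FLAG_KEY_FRAME;    // instead of FRAME_FLAG_NONE

// Timestamps and durations in this SDK are expressed in 100-nanosecond units.
final long timestampHns = System.currentTimeMillis() * 10000L;
final KinesisVideoFrame frame = new KinesisVideoFrame(
        frameIndex++,            // monotonically increasing frame index
        flags,                   // key-frame flag so the backend can start a fragment
        timestampHns,            // decoding timestamp
        timestampHns,            // presentation timestamp
        FRAME_DURATION_HNS,      // per-frame duration
        ByteBuffer.wrap(encodedFrameBytes));
mediaSourceSink.onFrame(frame);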

backdoorcodr commented 6 years ago

@zhiyua-git Thank you very much for your response; it was super helpful for making progress on this issue. I will briefly describe the further effort I made and the problem I am now facing, along with some questions.

Effort made:

After setting flags = FRAME_FLAG_KEY_FRAME; in the CameraMediaSource, I started to get "EventType":"ERROR", which made me realize that the format of the captured video frames is YUV. I investigated a bit further into creating a stream encoder that takes a frame in YUV and returns H264-encoded frames. Luckily, I was able to find an H264StreamEncoder example in the webcam-capture API, and I am using that now.
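
As a side note, once the encoder emits Annex-B H264 output, the key-frame flag does not have to stay hard-coded; a small helper such as the sketch below (assuming 4-byte 0x00000001 start codes) can derive it from the NAL unit type:

// A frame is treated as a key frame if it contains an IDR slice (NAL unit type 5).
static boolean isKeyFrame(final byte[] annexBFrame) {
    for (int i = 0; i + 4 < annexBFrame.length; i++) {
        if (annexBFrame[i] == 0 && annexBFrame[i + 1] == 0
                && annexBFrame[i + 2] == 0 && annexBFrame[i + 3] == 1) {
            final int nalType = annexBFrame[i + 4] & 0x1F;
            if (nalType == 5) {
                return true;
            }
        }
    }
    return false;
}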

Result:

The results seem quite positive, though we are not at the final result yet :) So far, the stream seems to be received by AWS Kinesis, as I can read the fragment ACK statuses (which include the BUFFERING, RECEIVED and PERSISTED statuses); a small portion is pasted below:

DEBUG / KinesisVideo: Received ACK bits: 75
{"EventType":"BUFFERING","FragmentTimecode":2148,"FragmentNumber":"91343852333186758436582441828464509038583822565"}

DEBUG / KinesisVideo: PutFrame index: 53, pts: 15141139501600000, dts: 15141139501600000, duration: 200000, keyFrame: true, flags: 1
DEBUG / KinesisVideo: Kinesis Video client and stream metrics
    >> Overall storage size: 1073741824
    >> Available storage size: 1073688246
    >> Allocated storage size: 53578
    >> Total view allocation size: 35256
    >> Total streams frame rate: 22
    >> Total streams transfer rate: 145707
    >> Current view duration: 0
    >> Overall view duration: 22090000
    >> Current view size: 0
    >> Overall view size: 51918
    >> Current frame rate: 22.28411331046968
    >> Current transfer rate: 145707
DEBUG / KinesisVideo: Data availability notification. Size: 905, Duration 200000
DEBUG / KinesisVideo: Streamed 905 bytes for stream my-stream
DEBUG / KinesisVideo: Received ACK bits: 74
{"EventType":"RECEIVED","FragmentTimecode":2148,"FragmentNumber":"91343852333186758436582441828464509038583822565"}

DEBUG / KinesisVideo: Received ACK bits: 75
{"EventType":"BUFFERING","FragmentTimecode":2189,"FragmentNumber":"91343852333186758441534201985606030152483799839"}

DEBUG / KinesisVideo: PutFrame index: 54, pts: 15141139502050000, dts: 15141139502050000, duration: 200000, keyFrame: true, flags: 1
DEBUG / KinesisVideo: Kinesis Video client and stream metrics
    >> Overall storage size: 1073741824
    >> Available storage size: 1073687308
    >> Allocated storage size: 54516
    >> Total view allocation size: 35256
    >> Total streams frame rate: 22
    >> Total streams transfer rate: 145707
    >> Current view duration: 0
    >> Overall view duration: 22530000
    >> Current view size: 0
    >> Overall view size: 52823
    >> Current frame rate: 22.305548600701623
    >> Current transfer rate: 145707
DEBUG / KinesisVideo: Data availability notification. Size: 806, Duration 200000
DEBUG / KinesisVideo: Streamed 806 bytes for stream my-stream
DEBUG / KinesisVideo: Received ACK bits: 74
{"EventType":"RECEIVED","FragmentTimecode":2189,"FragmentNumber":"91343852333186758441534201985606030152483799839"}

DEBUG / KinesisVideo: Received ACK bits: 75
{"EventType":"BUFFERING","FragmentTimecode":2238,"FragmentNumber":"91343852333186758446485962142747551264616594135"}

DEBUG / KinesisVideo: PutFrame index: 55, pts: 15141139502530000, dts: 15141139502530000, duration: 200000, keyFrame: true, flags: 1
DEBUG / KinesisVideo: Kinesis Video client and stream metrics
    >> Overall storage size: 1073741824
    >> Available storage size: 1073686469
    >> Allocated storage size: 55355
    >> Total view allocation size: 35256
    >> Total streams frame rate: 22
    >> Total streams transfer rate: 140819
    >> Current view duration: 0
    >> Overall view duration: 22980000
    >> Current view size: 0
    >> Overall view size: 53629
    >> Current frame rate: 22.284028006646768
    >> Current transfer rate: 140819
DEBUG / KinesisVideo: Data availability notification. Size: 769, Duration 200000
DEBUG / KinesisVideo: Streamed 769 bytes for stream my-stream
DEBUG / KinesisVideo: Received ACK bits: 74
{"EventType":"RECEIVED","FragmentTimecode":2238,"FragmentNumber":"91343852333186758446485962142747551264616594135"}

DEBUG / KinesisVideo: Received ACK bits: 75
{"EventType":"PERSISTED","FragmentTimecode":2148,"FragmentNumber":"91343852333186758436582441828464509038583822565"}

DEBUG / KinesisVideo: Received ACK bits: 75
{"EventType":"PERSISTED","FragmentTimecode":2189,"FragmentNumber":"91343852333186758441534201985606030152483799839"}

...

Git commit:

Here is my latest commit: f138814a835788afd0cc10d3220f7821ab38c81d

Problem:

Again, the results seem quite positive and there is progress, but we are not at the final result yet :)

I am now getting the error "The fragment did not contain any codec private data." I have multiple questions here:

The screenshot is attached here: screenshot from 2017-12-24 11-59-59

Questions:

  1. What could be the issue here? Is the encoding not done properly? As per my understanding from looking at the FragmentAckType class, the RECEIVED ack is returned only when a fragment has been received and parsed. If the encoding were not done properly, would it even be possible to decode the fragment?

  2. Is there any support that could be provided here for stream encoding, handling at least basic encoding from a webcam's native MJPEG, YUV420p, or YUY2 to H264? Is any encoding format other than H264 supported for streaming to Kinesis Video Streams?

  3. Any pointers or suggestions for tackling this issue?

unicornss commented 6 years ago

Glad you are able to set the FLAGs and stream with H264 encoding. You also need to set the correct CodecPrivateData based on your media source in DemoAppMain.java, e.g.:

final CameraMediaSourceConfiguration configuration =
                new CameraMediaSourceConfiguration.Builder()
                .withFrameRate(FPS_22)
                .withRetentionPeriodInHours(1)
                .withCameraId("/dev/video0")
                .withIsEncoderHardwareAccelerated(false)
                .withEncodingMimeType("video/avc")
                .withNalAdaptationFlags(StreamInfo.NalAdaptationFlags.NAL_ADAPTATION_FLAG_NONE)
                .withIsAbsoluteTimecode(false)
                .withEncodingBitRate(200000)
                .withHorizontalResolution(640)
                .withVerticalResolution(480)
                .withCodecPrivateData( )
                .build();

Reference:

-SS
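
For anyone wondering where the value for withCodecPrivateData(...) comes from: for H264 it is typically built from the SPS and PPS NAL units of the encoded stream. The helper below is a rough sketch (not part of the SDK) that collects the SPS/PPS NAL units from the first Annex-B key frame, assuming 4-byte 0x00000001 start codes:

import java.io.ByteArrayOutputStream;

// Returns the SPS (NAL type 7) and PPS (NAL type 8) units, start codes included,
// concatenated so the result can be used as the codec private data.
static byte[] extractCodecPrivateData(final byte[] annexBKeyFrame) {
    final ByteArrayOutputStream cpd = new ByteArrayOutputStream();
    int i = 0;
    while (i + 4 < annexBKeyFrame.length) {
        if (annexBKeyFrame[i] == 0 && annexBKeyFrame[i + 1] == 0
                && annexBKeyFrame[i + 2] == 0 && annexBKeyFrame[i + 3] == 1) {
            final int nalStart = i + 4;
            final int nalType = annexBKeyFrame[nalStart] & 0x1F;
            // Scan forward to the next start code (or the end of the buffer).
            int next = nalStart;
            while (next + 4 <= annexBKeyFrame.length
                    && !(annexBKeyFrame[next] == 0 && annexBKeyFrame[next + 1] == 0
                         && annexBKeyFrame[next + 2] == 0 && annexBKeyFrame[next + 3] == 1)) {
                next++;
            }
            final int nalEnd = (next + 4 <= annexBKeyFrame.length) ? next : annexBKeyFrame.length;
            if (nalType == 7 || nalType == 8) {
                cpd.write(annexBKeyFrame, i, nalEnd - i);   // keep start code + NAL unit
            }
            i = nalEnd;
        } else {
            i++;
        }
    }
    return cpd.toByteArray();
}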

backdoorcodr commented 6 years ago

@unicornss That was a very good pointer; many thanks for that, SS. Your pointers helped me understand things a bit more clearly and do the camera configuration correctly; it works now. I am now able to push live webcam streams to Kinesis :) However, streaming is a bit slow at the moment, maybe because I am in the us-west region; this may change once I switch to the eu-west region.

Here is my implementation branch: https://github.com/backdoorcodr/amazon-kinesis-video-streams-producer-sdk-java

This can also be used as an example of how to: (a) implement the camera configuration and other helper classes, and (b) convert from the webcam's native encoding to H264.

sachinrokade commented 6 years ago

Hello, I am new to AWS Kinesis, so could you please tell me a step-by-step procedure, or point me to a video tutorial, for streaming live CCTV camera video to Kinesis? I have read all of the Kinesis documentation but cannot understand it; I hope to get positive feedback. Also, when I start my Kinesis instance I get this error message: "No fragments found. No video fragments were found. Verify that video is currently being streamed and that the fragment timecodes are correct."

Thank you.

bdhandap commented 6 years ago

Hey Sachin,

"No video fragments were found." error means that there is no active producer ingesting data into Kinesis Video streams. Couple of troubleshooting tips when you get this error :

  1. You can use the sample GStreamer app to ingest video into KVS from your webcam. Your producer should be running continuously if you want to see the live video.
  2. You can also play back older fragments by clicking "earliest" or selecting an appropriate timestamp.
  3. Also make sure the region of the KVS console and the region configured in the producer app are the same (e.g. us-west-2).

Thanks Babu

mkhajuriwala commented 6 years ago

I downloaded the code from https://github.com/backdoorcodr/amazon-kinesis-video-streams-producer-sdk-java and generated KinesisVideoProducerJNI with the help of amazon-kinesis-video-streams-producer-sdk-cpp, but when I run the DemoAppMain class it throws the following error:

    Creating Kinesis Video client.
    setCallbacks(): Couldn't find method id streamDataAvailable
    throwNativeException(): Had to clear a pending exception found when throwing "Failed to set the callbacks." (code 0x0)
    throwNativeException(): Throwing com/amazonaws/kinesisvideo/producer/ProducerException with message: Failed to set the callbacks.
    Exception in thread "main" java.lang.RuntimeException: com.amazonaws.kinesisvideo.producer.ProducerException: Failed to set the callbacks.
        at com.amazonaws.kinesisvideo.demoapp.DemoAppMain.main(DemoAppMain.java:62)
    Caused by: com.amazonaws.kinesisvideo.producer.ProducerException: Failed to set the callbacks.
        at com.amazonaws.kinesisvideo.producer.jni.NativeKinesisVideoProducerJni.createKinesisVideoClient(Native Method)
        at com.amazonaws.kinesisvideo.producer.jni.NativeKinesisVideoProducerJni.create(NativeKinesisVideoProducerJni.java:219)
        at com.amazonaws.kinesisvideo.producer.jni.NativeKinesisVideoProducerJni.create(NativeKinesisVideoProducerJni.java:186)
        at com.amazonaws.kinesisvideo.client.NativeKinesisVideoClient.initialize(NativeKinesisVideoClient.java:112)
        at com.amazonaws.kinesisvideo.java.client.KinesisVideoJavaClientFactory.createKinesisVideoClient(KinesisVideoJavaClientFactory.java:106)
        at com.amazonaws.kinesisvideo.java.client.KinesisVideoJavaClientFactory.createKinesisVideoClient(KinesisVideoJavaClientFactory.java:79)
        at com.amazonaws.kinesisvideo.demoapp.DemoAppMain.main(DemoAppMain.java:47)

Any idea what this means and how I can resolve it? Please help.

zhiyua-git commented 6 years ago

@mkhajuriwala Are you using the latest version of both the Java Producer SDK and the C++ Producer SDK? If not, please update to the latest and try again.

If you are using the latest code, are you modifying any Java-layer code, say overriding StreamCallbacks? The error indicates that one callback function could not be found in the Java class passed to the JNI layer.

mkhajuriwala commented 6 years ago

@zhiyua-git Initially the error I got was: "This app is built to run with version 1.2 of the libKinesisVideoProducerJNI.so library, but version 1.5 was found on this device". To overcome this I changed the version in the NativeKinesisVideoProducerJni class: private static final String EXPECTED_LIBRARY_VERSION = "1.5"; Other than that I have not made any changes, and yes, I have cloned the latest version.

zhiyua-git commented 6 years ago

@mkhajuriwala You should not need to manually update the expected version inside the code; that check exists to catch exactly the kind of weird JNI linking issue you saw. Please update your Java Producer SDK to the latest version, which is already at 1.5, rather than the 1.2 you have locally.

mkhajuriwala commented 6 years ago

@zhiyua-git Thank you so much for the help. I downloaded the latest version and I am able to stream the video, but this is what I am getting now: screen shot 2018-04-08 at 3 17 29 pm. I even changed the first 4 bytes of a video frame to 0x00 0x00 0x00 0x01 and got the following exception: screen shot 2018-04-08 at 3 22 22 pm

MushMal commented 6 years ago

@mkhajuriwala Are you running the Java demo application? If not, what is your media source? Generally, encoders use the Annex-B format, whereas the higher-level packaging uses the AvCC format for the NALUs. The console playback will only play back H264 video in AvCC format.

Please provide more information so we can help you better.
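
To illustrate the Annex-B vs. AvCC difference (not a step you normally need to hand-roll, since the SDK's StreamInfo.NalAdaptationFlags can perform the adaptation): converting an Annex-B frame to AvCC means replacing each start code with a 4-byte big-endian length prefix. A sketch, assuming 4-byte 0x00000001 start codes only:

import java.io.ByteArrayOutputStream;
import java.nio.ByteBuffer;
import java.util.ArrayList;
import java.util.List;

static byte[] annexBToAvcc(final byte[] annexB) {
    // Collect the offset of every 4-byte start code.
    final List<Integer> starts = new ArrayList<>();
    for (int i = 0; i + 4 <= annexB.length; i++) {
        if (annexB[i] == 0 && annexB[i + 1] == 0 && annexB[i + 2] == 0 && annexB[i + 3] == 1) {
            starts.add(i);
        }
    }
    final ByteArrayOutputStream out = new ByteArrayOutputStream();
    for (int n = 0; n < starts.size(); n++) {
        final int nalStart = starts.get(n) + 4;
        final int nalEnd = (n + 1 < starts.size()) ? starts.get(n + 1) : annexB.length;
        final int nalLength = nalEnd - nalStart;
        out.write(ByteBuffer.allocate(4).putInt(nalLength).array(), 0, 4);  // length prefix
        out.write(annexB, nalStart, nalLength);                             // NAL payload
    }
    return out.toByteArray();
}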

vatsalya1 commented 6 years ago

Hi, I also get the following: "This app is built to run with version 1.2 of the libKinesisVideoProducerJNI.so library, but version 1.5 was found on this device".

How do I update the Java Producer SDK to the latest version? I am using Eclipse.

MushMal commented 6 years ago

@vatsalya1 this is an indication that you have a mismatch between the Java and the native built library versions. Please try to pull down the latest Java SDK and ensure you use the latest native libraries.

Please refer to https://github.com/awslabs/amazon-kinesis-video-streams-producer-sdk-java#building-from-source for building the native libraries yourself if you need to.

parvezkhusro commented 5 years ago

> @unicornss That was a very good pointer; many thanks for that, SS. Your pointers helped me understand things a bit more clearly and do the camera configuration correctly; it works now. I am now able to push live webcam streams to Kinesis :) However, streaming is a bit slow at the moment, maybe because I am in the us-west region; this may change once I switch to the eu-west region.
>
> Here is my implementation branch: https://github.com/backdoorcodr/amazon-kinesis-video-streams-producer-sdk-java
>
> This can also be used as an example of how to: (a) implement the camera configuration and other helper classes, and (b) convert from the webcam's native encoding to H264.

@backdoorcodr I am able to get the stored images into the Kinesis stream, but I need to stream video from an IP cam or a laptop camera. For this, I tried the repo you linked above, but I am getting: "An exception occured while executing the Java class. FATAL DEPLOYMENT ERROR: This app is built to run with version 1.2 of the libKinesisVideoProducerJNI.so library, but version 1.8 was found on this device". After I updated the SDK, CameraMediaSource is missing for me!

Can you please guide me on what to do next to resolve this?

huynv2909 commented 5 years ago

Hello everyone, I cloned and built @backdoorcodr's code, but my project is missing the CameraMediaSource class; I cannot find it. Can someone help me?

BodapatlaKiranReddy commented 3 years ago

@MushMal @backdoorcodr The latest SDK doesn't have the CameraMediaSource class, so I cannot attach a webcam/laptop camera as a media source. Is there any update on Java support for pushing a live video stream from a webcam to a Kinesis video stream? Please help me with how to do this in Java.