ricardosutana opened 1 year ago
Yes, this is an H.264 stream, and in some configurations you'll get two streams (see stream_index). For example, the 3072x3072 configuration produces two streams, one per sensor. You can use any library to decode the stream; I used ffmpeg, starting from the ffmpeg examples. I had to find the right decoder and the right ffmpeg build to get the NVIDIA decoder 'h264_cuvid', but I started with the standard software decoder. That works too, it just consumes a lot of CPU. With a precompiled ffmpeg, follow the decode_video.c example: instead of reading from a file, feed it the stream we get from the Insta libraries. Note that the example decodes MPEG-1; you need to change it to decode H.264. On my side, since I wanted to pick a specific decoder, I used one of these methods (cuvid works with NVIDIA GPUs; pick the one that works for your configuration):
/* find the H.264 video decoder */
codec = avcodec_find_decoder_by_name("h264_cuvid");
/* or the software decoder: codec = avcodec_find_decoder(AV_CODEC_ID_H264); */
if (!codec) {
    fprintf(stderr, "Codec not found\n");
    exit(1);
}
Once decoded, you can inject the image into whichever framework you need for processing. Note: copying the result back and forth between the GPU and the CPU may not be optimal.
Hi, I'm working with the camera SDK to get the fisheye images. I was able to compile and test the SDK functionality. Looking into the function OnVideoData we have
void OnVideoData(const uint8_t* data, size_t size, int64_t timestamp, uint8_t streamType, int stream_index = 0)
I suppose the images are encoded in H.264 format and stored at the data
pointer. How can I decode them and display them in live-stream mode?