Objective

The base station needs to decode and render MJPEG and H.264 video formats. Find a cross-platform library that we can use to decode frames and draw them in the OpenGL context.
The incoming video will arrive as a stream of bytes from the rover; the format and dimensions will be known ahead of time and can be passed to the decoder as inputs.
When rendering the video, we must be able to dynamically change the resolution and position.
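As a rough sketch of the rendering side, a decoded frame could be uploaded into an OpenGL texture and drawn as a quad whose position and size are plain parameters, which covers the dynamic resizing/repositioning requirement. This assumes RGBA pixel data from the decoder and a current legacy (compatibility-profile) OpenGL context; a core-profile context would need a shader and vertex buffer instead of glBegin/glEnd. The function names below are placeholders, not existing code.

```cpp
// Sketch only: assumes a current compatibility-profile OpenGL context and
// tightly packed RGBA pixels from the decoder. Names are placeholders.
#include <GL/gl.h>
#include <cstdint>

GLuint createVideoTexture(int width, int height) {
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Allocate storage once; individual frames are uploaded with glTexSubImage2D.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    return tex;
}

// Upload the latest decoded frame and draw it as a textured quad. The
// x/y/w/h arguments are in normalized device coordinates, so the caller
// can move and resize the video freely from one frame to the next.
void drawVideoFrame(GLuint tex, const uint8_t *rgba, int width, int height,
                    float x, float y, float w, float h) {
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgba);

    glEnable(GL_TEXTURE_2D);
    glBegin(GL_QUADS);
    glTexCoord2f(0.f, 1.f); glVertex2f(x,     y);
    glTexCoord2f(1.f, 1.f); glVertex2f(x + w, y);
    glTexCoord2f(1.f, 0.f); glVertex2f(x + w, y + h);
    glTexCoord2f(0.f, 0.f); glVertex2f(x,     y + h);
    glEnd();
    glDisable(GL_TEXTURE_2D);
}
```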
Once the decode/rendering part is working, we can start making a video stream player module in NanoGUI.
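As a rough idea of what that module might look like, the sketch below wraps stream selection and start/stop controls in a NanoGUI window. It uses the camelCase API of the original wjakob/nanogui (newer forks use snake_case names such as set_callback), and everything here, including VideoPlayerWindow and the camera names, is a placeholder rather than existing code.

```cpp
#include <nanogui/nanogui.h>

// Hypothetical player skeleton; the actual stream-control hooks do not exist yet.
class VideoPlayerWindow : public nanogui::Window {
public:
    VideoPlayerWindow(nanogui::Widget *parent)
        : nanogui::Window(parent, "Rover Video") {
        setLayout(new nanogui::GroupLayout());

        // Stream selection: entries would come from the rover's camera list.
        auto *streams = new nanogui::ComboBox(this, {"Front camera", "Arm camera"});
        streams->setCallback([this](int index) { /* switch stream (placeholder) */ });

        // Start/stop the incoming video.
        auto *toggle = new nanogui::Button(this, "Start");
        toggle->setCallback([this, toggle] {
            mPlaying = !mPlaying;
            toggle->setCaption(mPlaying ? "Stop" : "Start");
        });

        // Camera settings (exposure, resolution, ...) would get their own widgets.
        new nanogui::Label(this, "Camera settings", "sans-bold");
    }

private:
    bool mPlaying = false;
};
```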
Suggestions and Resources
FFmpeg, a software suite with an extensive collection of libraries for video and audio processing. Documentation is somewhat limited.
video-app, someone's project that uses FFmpeg and GLFW for video rendering.
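To gauge FFmpeg's suitability, the sketch below shows a minimal decode path with its libavcodec library for the H.264 case (MJPEG would swap in AV_CODEC_ID_MJPEG). The Decoder/decodeChunk names are placeholders, and cleanup and error handling are omitted. Raw bytes from the rover are split into packets by a parser, and each packet can yield zero or more decoded frames.

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
}
#include <cstdint>
#include <functional>

// Sketch of decoding a raw H.264 byte stream with libavcodec.
struct Decoder {
    const AVCodec *codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    AVCodecParserContext *parser = av_parser_init(codec->id);
    AVPacket *packet = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();

    Decoder() { avcodec_open2(ctx, codec, nullptr); }

    // Feed a chunk of bytes from the rover and invoke onFrame for every
    // complete frame the decoder produces. Frames are typically YUV420P
    // and would be converted to RGB with libswscale before the GL upload.
    void decodeChunk(const uint8_t *data, int size,
                     const std::function<void(const AVFrame *)> &onFrame) {
        while (size > 0) {
            int used = av_parser_parse2(parser, ctx,
                                        &packet->data, &packet->size,
                                        data, size,
                                        AV_NOPTS_VALUE, AV_NOPTS_VALUE, 0);
            data += used;
            size -= used;
            if (packet->size == 0)
                continue;                      // parser needs more bytes
            if (avcodec_send_packet(ctx, packet) < 0)
                continue;                      // skip undecodable packet
            while (avcodec_receive_frame(ctx, frame) == 0)
                onFrame(frame);
        }
    }
};
```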
Timeline and Next Steps

Find a library and demonstrate its suitability for the base station by December 14th. Video capture and streaming will be completed by this date. In parallel, we may create a video player with options to select the stream, stop/start the video, change camera settings, etc. Once the player is ready, we can add the video rendering to it.