nasa-jpl / LiveViewOpenCL

Open Source real-time visualization tools for Imaging Spectrometer development
GNU General Public License v3.0

Add playback controls to file-based camera models #25

Open jackiryan opened 5 years ago

jackiryan commented 5 years ago

Introduction

This is a major modification to the way that LiveView works, and I may re-write it as an epic if it's unclear how to implement it. Currently, all LiveView camera models operate under a "streaming" concept; that is, data is streamed from a source continuously until that source has no more data. This model makes sense for hardware-based streams because the data is being fed directly from a frame grabber board to the software.

For file-based camera models, e.g., the XIO and ENVI cameras, we don't need to stream: instead of a "fire hose" of data, the pixels are stored on a drive and can be played back like a video.

So, given that the data is really just an uncompressed video, we can add playback controls like a play/pause button, fast forward, rewind, and frame-by-frame stepping. First, let's figure out playing and pausing, which should be pretty simple and can apply to any camera model, even the Camera Link cameras (EDT and Dalsa).
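To illustrate why rewind and fast forward are cheap for file-based models: in an uncompressed frame file, the byte offset of any frame is a single multiplication away. Here is a minimal sketch (the struct and function names are hypothetical, not part of LiveView) assuming a fixed-size optional header followed by back-to-back frames:

```cpp
#include <cstdint>
#include <cstddef>

// Hypothetical geometry for an uncompressed frame file.
struct FrameGeometry {
    std::size_t width;          // pixels per row
    std::size_t height;         // rows per frame
    std::size_t bytesPerPixel;  // e.g. 2 for 16-bit data
    std::size_t headerBytes;    // fixed file header, 0 if none
};

// Byte offset of frame `index` within the file. Because the data is
// uncompressed, seeking to an arbitrary frame is O(1), which is what
// makes playback controls practical for file-based camera models.
inline std::uint64_t frameOffset(const FrameGeometry &g, std::uint64_t index)
{
    const std::uint64_t frameSize =
        static_cast<std::uint64_t>(g.width) * g.height * g.bytesPerPixel;
    return g.headerBytes + index * frameSize;
}
```

A reader thread can then `seek` to `frameOffset(g, n)` and read one frame of `width * height * bytesPerPixel` bytes for whatever frame `n` the playback controls request.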

jackiryan commented 5 years ago

Thoughts on implementation

Memory

Right now, the camera models use a deque to store frames. This makes a lot of sense when you are streaming, because you only care about future frames and can throw old ones out (pop_back) as soon as they are sent to the FrameWorker and beyond. However, it is not great when a user could go back and request old frames.
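One possible direction, sketched below with hypothetical names (none of this exists in LiveView today): keep decoded frames in a growable store and let the playback controls move a cursor over them, so stepping backward never loses data. A real implementation would need an eviction policy to cap memory.

```cpp
#include <cstdint>
#include <cstddef>
#include <vector>

// Hypothetical frame store: instead of a deque that discards frames after
// display, retain every frame and move a playhead cursor over them.
class FrameStore {
public:
    void push(std::vector<uint16_t> frame) { frames_.push_back(std::move(frame)); }

    // Playback controls only move the cursor; no frame data is lost.
    const std::vector<uint16_t> *current() const {
        return frames_.empty() ? nullptr : &frames_[cursor_];
    }
    bool stepForward() {
        if (cursor_ + 1 >= frames_.size()) return false;
        ++cursor_;
        return true;
    }
    bool stepBack() {
        if (cursor_ == 0) return false;
        --cursor_;
        return true;
    }
    std::size_t size() const { return frames_.size(); }

private:
    std::vector<std::vector<uint16_t>> frames_;
    std::size_t cursor_ = 0;
};
```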

Capture loops

The main frame acquisition loops in LiveView, FrameWorker::captureFrames, FrameWorker::captureDSFrames, and FrameWorker::captureSDFrames, all work as quickly as possible to capture frames. When streaming, this is what you want: the data should be processed as soon as it is ready so we can always display the most recent frame. With playing and pausing, however, there is no longer a need to keep these loops running constantly. They should instead behave more like one-shots, perhaps with a bit of buffering. By "one-shot" I mean a function that acquires one frame from the camera, or processes one frame of standard deviation, on demand. Instead of a dumb loop calling these functions over and over, you have a system that fills a frame buffer as needed. Likewise, the playback controls are really controlling which frame to display, rather than when to capture frames.
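The one-shot idea above could be sketched roughly like this (a hypothetical illustration, not LiveView's actual API): a single call pulls exactly one raw frame from a source and runs one stage of processing on it, returning nothing when the source is exhausted. The caller, driven by the playback controls, decides when to invoke it.

```cpp
#include <cstdint>
#include <functional>
#include <optional>
#include <vector>

using Frame = std::vector<uint16_t>;

// Hypothetical one-shot acquisition: produce exactly one processed frame
// on demand instead of spinning in a free-running capture loop. Returns
// std::nullopt when the source has no more data.
std::optional<Frame> acquireOne(const std::function<std::optional<Frame>()> &source,
                                const std::function<Frame(const Frame &)> &process)
{
    if (auto raw = source()) {
        return process(*raw);
    }
    return std::nullopt;  // source is dry, e.g. end of file
}
```

With this shape, captureDSFrames-style processing becomes just another `process` stage, and "play" is simply a timer calling `acquireOne` at the frame rate while "pause" stops calling it.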

jackiryan commented 3 years ago

Notes from October 8th meeting

Philip suggested that this issue be implemented using semaphores from the OS. I think a more portable solution might be to use atomic variables as flags to indicate when to play and pause the stream. Similar flags could be used to make the captureFrames work function run as a single shot for skipping through one frame at a time. The atomic type I am referring to for this implementation is std::atomic in C++.
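A minimal sketch of the std::atomic approach, with hypothetical names (PlaybackState and framesToAdvance are illustrations, not existing LiveView code): the GUI thread flips the flags, and the capture thread polls them once per iteration to decide whether to advance.

```cpp
#include <atomic>

// Hypothetical playback state shared between the GUI and capture threads,
// using std::atomic as a portable alternative to OS semaphores.
struct PlaybackState {
    std::atomic<bool> playing{false};  // play/pause toggle
    std::atomic<int>  stepRequest{0};  // +1 = skip one frame fwd, -1 = back
};

// Called once per capture-loop iteration; returns how many frames to
// advance on this pass (0 means hold the current frame while paused).
inline int framesToAdvance(PlaybackState &s)
{
    if (s.playing.load(std::memory_order_relaxed)) return 1;
    // While paused, atomically consume any pending single-step request so
    // one button press advances exactly one frame.
    return s.stepRequest.exchange(0, std::memory_order_relaxed);
}
```

The `exchange` is the important part: it reads and clears the step request in one atomic operation, so a single button press cannot be double-counted by the capture thread.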

The MVP capability should be a play/pause button, a skip one frame forward button, and a skip one frame back button.