l1npengtul / nokhwa

Cross Platform Rust Library for Powerful Webcam/Camera Capture
Apache License 2.0

How to set the number of buffers to read from the camera? #138

Open · wangxiaochuTHU opened this issue 1 year ago

wangxiaochuTHU commented 1 year ago

Hello, I am currently using this crate for a cross-platform case.

I would like to be able to set the number of buffers to read from the camera, just as V4L2 allows, in order to avoid missing images.

However, I could not figure out from the examples how to set that. Could you give any guidance? Thank you very much.
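
For reference, this is roughly what I mean by setting the number of buffers under V4L2. A minimal sketch using the `v4l` crate directly (not nokhwa); the device index and buffer count are just example values:

```rust
use v4l::buffer::Type;
use v4l::io::mmap::Stream;
use v4l::io::traits::CaptureStream;
use v4l::Device;

fn main() -> std::io::Result<()> {
    // Open /dev/video0 (example index).
    let dev = Device::new(0)?;

    // Ask the driver for 4 mmap buffers (VIDIOC_REQBUFS under the hood);
    // more buffers give the application more slack before frames are dropped.
    let mut stream = Stream::with_buffers(&dev, Type::VideoCapture, 4)?;

    // Grab one frame just to show the stream works.
    let (frame, meta) = stream.next()?;
    println!("got {} bytes, sequence {}", frame.len(), meta.sequence);
    Ok(())
}
```

This kind of buffer-count knob is what I was hoping to find in nokhwa's API.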

wangxiaochuTHU commented 1 year ago

I'm sorry, but in my use case frames streamed as slowly as 5 fps can still be lost/dropped, even though only about 4 μs of work is performed on each frame's arrival. So I suspect that the bottleneck might be in your lib.

Since most drivers deliver a captured image frame by frame (rather than byte by byte or line by line), I think we should shorten the handling time after a frame is received and resume capturing as soon as possible. Here is an idea based on that principle; what do you think about it? (The code pattern is given below.)

The lib API would look something like this:

use std::sync::mpsc;

/// Turn the stream on, and return `rx` to the user for receiving `Buffer`s.
pub fn stream_on(&mut self) -> mpsc::Receiver<Buffer> {
    /*
        set the stream on
    */

    let (tx, rx) = mpsc::channel();
    self.tx = Some(tx);
    rx
}

/// Infinite loop requesting the next sample; each sample is handed off to a worker thread.
pub fn frames(&mut self) -> Result<(), NokhwaError> {
    loop {
        let imf_sample: IMFSample = match unsafe { MFCreateSample() } {
            Ok(sample) => sample,
            Err(why) => return Err(NokhwaError::ReadFrameError(why.to_string())),
        };

        /*
            Here, can we handle imf_sample in a background thread,
            e.g. handle_tx.send(Wrapper(imf_sample))?
            That way we can wait for the next frame as soon as possible.
        */
    }
}

/// The handler thread: converts each received sample into a `Buffer` and forwards it to the user.
pub fn sample_handle_thread(handle_rx: mpsc::Receiver<Wrapper>, tx: mpsc::Sender<Buffer>) {
    while let Ok(Wrapper(imf_sample)) = handle_rx.recv() {
        /*
            handle the imf_sample and produce a `Buffer`
        */
        let _ = tx.send(buffer);
    }
}

Thus users can obtain `Buffer`s by receiving from `rx`.
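
To make the hand-off pattern concrete, here is a small self-contained sketch of the same idea using only `std::thread` and `std::sync::mpsc`; the Media Foundation types are replaced with placeholder structs, so none of this is nokhwa's actual API:

```rust
use std::sync::mpsc;
use std::thread;

// Placeholder stand-ins for the real types (IMFSample, nokhwa's Buffer, ...).
struct Sample(u64);
struct Frame(u64);

fn main() {
    // Capture thread -> handler thread.
    let (handle_tx, handle_rx) = mpsc::channel::<Sample>();
    // Handler thread -> user.
    let (tx, rx) = mpsc::channel::<Frame>();

    // Handler thread: does the (possibly slow) conversion off the capture path.
    thread::spawn(move || {
        while let Ok(Sample(n)) = handle_rx.recv() {
            // "handle the sample and get a frame buffer"
            let _ = tx.send(Frame(n));
        }
    });

    // Capture loop: only grabs the sample and immediately hands it off,
    // so it can wait for the next frame as soon as possible.
    thread::spawn(move || {
        for n in 0..10u64 {
            // Stands in for MFCreateSample() / reading the next sample.
            if handle_tx.send(Sample(n)).is_err() {
                break;
            }
        }
    });

    // User side: receives finished frames from rx, exactly like the proposed API.
    for frame in rx {
        println!("got frame {}", frame.0);
    }
}
```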

wangxiaochuTHU commented 1 year ago

Hello, the frame-drop problem in my case was finally solved by setting the priority of the capture thread to "Time Critical" (this is on Windows). With this change, the drop ratio decreased from about 500 out of 10000 to a relatively low number, about 25 out of 10000.
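
In case anyone else hits the same problem, here is roughly how the priority can be raised. A sketch assuming the `windows` crate with the `Win32_System_Threading` feature enabled (exact signatures can differ slightly between crate versions):

```rust
use windows::Win32::System::Threading::{
    GetCurrentThread, SetThreadPriority, THREAD_PRIORITY_TIME_CRITICAL,
};

/// Raise the priority of the calling thread; run this from inside the capture thread.
fn boost_current_thread_priority() {
    unsafe {
        // Return value ignored here for brevity; real code should check it.
        let _ = SetThreadPriority(GetCurrentThread(), THREAD_PRIORITY_TIME_CRITICAL);
    }
}
```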

Actually, my use case was to capture video at a resolution of 5000x5000 at 5 Hz and then display, handle, and save the images. I found it hard to understand the Windows camera-capture process/WinAPI. Thanks to your crate, I was able to achieve this goal much more easily and did not need to turn to Linux. Thank you very much, it is really a great project.