Use case

I want to create an async camera wrapper over `nokhwa::Camera`. In my implementation, the streaming part uses `tokio::sync::watch`, and the user sets control values through `MyCamera`.

My goal is that changing a control, or interacting with the camera in any other way, never affects the frame rate.
This is easy on Windows, since the MF backend allows opening multiple `nokhwa::Camera` instances for the same device, but only one of them can have its stream enabled. So I create two `nokhwa::Camera`s: one for the task dealing with streaming, one for the task dealing with property read/write, each living on its own thread.
I can achieve a stable 30 fps (which is what my UVC camera delivers) while constantly fuzzing the controls with new values.
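Here is a minimal sketch of that architecture; `MyCamera`, `Frame`, and `ControlCmd` are hypothetical names of my own, and the actual `nokhwa` calls are left as comments since the exact signatures depend on the version:

```rust
use std::thread;
use std::time::Duration;
use tokio::sync::{mpsc, watch};

// Placeholder for a decoded frame; in practice this would wrap nokhwa's buffer.
type Frame = Vec<u8>;

// Hypothetical control commands handled by the property thread.
enum ControlCmd {
    SetBrightness(i64),
}

pub struct MyCamera {
    pub frames: watch::Receiver<Option<Frame>>,
    pub controls: mpsc::Sender<ControlCmd>,
}

impl MyCamera {
    pub fn open(_index: u32) -> MyCamera {
        let (frame_tx, frame_rx) = watch::channel(None);
        let (ctrl_tx, mut ctrl_rx) = mpsc::channel::<ControlCmd>(16);

        // Streaming thread: owns its own nokhwa::Camera and does nothing
        // but grab frames, so nothing else can stall it.
        thread::spawn(move || {
            // let mut cam = nokhwa::Camera::new(..)?; cam.open_stream()?;
            loop {
                // let frame = cam.frame()?.buffer().to_vec();
                thread::sleep(Duration::from_millis(33)); // stand-in for the blocking frame() call
                let frame: Frame = Vec::new();
                if frame_tx.send(Some(frame)).is_err() {
                    break; // every receiver is gone, stop streaming
                }
            }
        });

        // Property thread: a second nokhwa::Camera on the same device,
        // used only for control get/set, so it never touches the stream.
        thread::spawn(move || {
            // let mut cam = nokhwa::Camera::new(..)?;
            while let Some(cmd) = ctrl_rx.blocking_recv() {
                match cmd {
                    ControlCmd::SetBrightness(_v) => {
                        // cam.set_camera_control(..);
                    }
                }
            }
        });

        MyCamera {
            frames: frame_rx,
            controls: ctrl_tx,
        }
    }
}
```

On the consumer side, `frames.changed().await` always yields the latest frame with no queue buildup, which is exactly what `tokio::sync::watch` is for.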
Problem
Now I take the same code to Linux with V4L, and it doesn't work, since V4L only allows a device to be opened once at a time. I tried the alternative of using `CallbackCamera`, but that hits the same problem: accessing the frame goes through a mutex around the internal `nokhwa::Camera`.
Proposed solution
My idea is to update `nokhwa-bindings-linux/` to keep a global pool of every opened device (`v4l::Device`), and to remove the `device: Device` field of `V4LCaptureDevice`. An `MmapStream` could still only be created by one `V4LCaptureDevice` at a time, and calling `frame()` would no longer depend on any `v4l::Device`, making it as fast as possible and lock-free.
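As a rough illustration of the pooling idea, here is a sketch with a stand-in `Device` type instead of the real `v4l::Device`, ignoring error handling and device teardown:

```rust
use std::collections::HashMap;
use std::sync::{Arc, Mutex, OnceLock};

// Stand-in for v4l::Device; the real pool would hold Arc<v4l::Device>.
struct Device {
    index: usize,
}

// One shared handle per device index: opening /dev/videoN a second time
// hands back the existing handle instead of failing like a second open would.
static POOL: OnceLock<Mutex<HashMap<usize, Arc<Device>>>> = OnceLock::new();

fn open_shared(index: usize) -> Arc<Device> {
    let pool = POOL.get_or_init(|| Mutex::new(HashMap::new()));
    pool.lock()
        .unwrap()
        .entry(index)
        .or_insert_with(|| Arc::new(Device { index }))
        .clone()
}
```

The pool mutex is only taken when a device is opened; the stream would capture its own `Arc<Device>` when the `MmapStream` is created, so the `frame()` hot path never goes back through the pool.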
What I'm doing
I know an async `Camera` is planned, but it is not there yet.
I will implement my proposed solution in my fork for my own needs.
I would be happy to contribute this back if you think it is a good idea; having Windows and Linux behave the same way seems essential for a cross-platform library.
I would also be happy to contribute my async camera implementation if you are interested. What makes it interesting is that it is non-invasive: the low-level bindings stay synchronous.
So please let me know what you think about all of this. I know you are short on time. I want to take this code into production at my company, so I will be happy to support it in the future on company time.