DanielMSchmidt opened 1 year ago
The Camera should be created once, and for your application a CallbackCamera would be appropriate so you don't block the main thread waiting on a frame. Once you get a buffer, you should decode it using RgbFormat to get an ImageBuffer<Rgb<u8>, Vec<u8>>, which should be trivial to display in any image and/or canvas2d widget.
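For intuition, this is roughly the per-pixel color math an RgbFormat decode performs on YUV camera data. This is not nokhwa's actual code, just a minimal sketch of the BT.601 limited-range conversion; the `yuv_to_rgb` name is mine, and the real decoder additionally handles the NV12/YUYV plane layouts:

```rust
// Sketch of BT.601 limited-range YUV -> RGB, the conversion an
// RgbFormat decode conceptually applies to each pixel. Integer
// fixed-point coefficients; real decoders also deinterleave planes.
fn yuv_to_rgb(y: u8, u: u8, v: u8) -> (u8, u8, u8) {
    let c = y as i32 - 16;
    let d = u as i32 - 128;
    let e = v as i32 - 128;
    let clamp = |x: i32| x.clamp(0, 255) as u8;
    (
        clamp((298 * c + 409 * e + 128) >> 8),
        clamp((298 * c - 100 * d - 208 * e + 128) >> 8),
        clamp((298 * c + 516 * d + 128) >> 8),
    )
}

fn main() {
    // Limited-range black (Y=16) and white (Y=235) map to the
    // full-range RGB extremes.
    assert_eq!(yuv_to_rgb(16, 128, 128), (0, 0, 0));
    assert_eq!(yuv_to_rgb(235, 128, 128), (255, 255, 255));
}
```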
senpai is pretty close to release; just need to fix NV12 and make sure AVFoundation works.
When I use nokhwa 0.10.3, the following error is printed. Is the conversion from NV12 to RGB format still unavailable?
Captured Single Frame of 4147200
Frame format: NV12
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: ProcessFrameError { src: NV12, destination: "RGB", error: "bad input buffer size" }', src/main.rs:97:53
Thank you sincerely.
The NV12 decoder is likely still somewhat buggy, or nokhwa is confused. Try running it through yuyv and see what happens.
If the issue persists, please open a new issue @bookshiyi
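One way to sanity-check the error yourself: assuming the capture was 1920×1080 (which the byte count suggests, but is not stated in the report), the reported frame size matches YUYV exactly and does not match NV12 at all, which would explain the "bad input buffer size" panic. The helper names below are mine:

```rust
// Expected buffer sizes for a w x h frame (even dimensions assumed).
fn nv12_len(w: usize, h: usize) -> usize {
    // Full-resolution Y plane plus an interleaved half-resolution UV plane.
    w * h + (w * h) / 2
}

fn yuyv_len(w: usize, h: usize) -> usize {
    // YUYV packs 2 bytes per pixel.
    w * h * 2
}

fn main() {
    // The reported 4147200-byte frame is exactly YUYV-sized for
    // 1920x1080, not NV12-sized.
    assert_eq!(nv12_len(1920, 1080), 3_110_400);
    assert_eq!(yuyv_len(1920, 1080), 4_147_200);
}
```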
Hi there 👋
I'm trying to use nokhwa (built from this main branch) to show my webcam video in an iced application. I'm pretty new to this ecosystem, so sorry if this is a dumb question, but how do I use the camera output with iced's image widget? My application looks like this:
I can contribute an example for this integration if you like :)
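One likely missing step, for whoever lands here: iced's raw-pixel image handle generally expects RGBA8, while the decoded ImageBuffer<Rgb<u8>, Vec<u8>> is RGB8, so an alpha channel has to be appended first. The exact iced constructor varies by version (check its docs), so only the version-agnostic padding step is sketched here; the `rgb_to_rgba` name is mine:

```rust
// Pad tightly packed RGB8 bytes to RGBA8 by appending an opaque
// alpha byte per pixel -- the layout raw-pixel image handles (e.g.
// in iced) typically expect. Not iced API code; check your version.
fn rgb_to_rgba(rgb: &[u8]) -> Vec<u8> {
    let mut rgba = Vec::with_capacity(rgb.len() / 3 * 4);
    for px in rgb.chunks_exact(3) {
        rgba.extend_from_slice(px);
        rgba.push(255); // opaque alpha
    }
    rgba
}

fn main() {
    // One red pixel followed by one green pixel.
    let rgb = [255, 0, 0, 0, 255, 0];
    assert_eq!(rgb_to_rgba(&rgb), vec![255, 0, 0, 255, 0, 255, 0, 255]);
}
```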