This repo's objective is to provide a WebCam server on the most popular Raspberry Pi hardware. By integrating [WebRTC](https://webrtc.org/native-code/) with the Raspberry Pi, we can stream the Raspberry camera feed to a browser or any native client that speaks WebRTC.
Hi @kclyu,
really appreciate your great work here!
Is it possible to get the raw image buffer after it's captured from raspivid, before it's sent to the encoder, in order to perform extra image processing, e.g. face detection and other "server-side" processing? Of course I can get the image on the other peer, in the browser, but I want to allow this processing on the Pi. I'm aware the captured image resolution may change on the fly as network conditions change.
Thanks a lot!
Uzi
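For illustration, the kind of pre-encoder processing asked about could be sketched as reading raw YUV420 frames (e.g. from `raspividyuv -o -` piped to stdout; that command and the fixed frame geometry are assumptions, not this repo's actual pipeline) and inspecting the luma plane before anything reaches the encoder. A synthetic frame stands in for one camera read:

```python
# Hypothetical sketch, not this repo's API: parse one raw YUV420 frame as it
# might arrive from a pipe such as `raspividyuv -o -` (assumed source).
WIDTH, HEIGHT = 640, 480
FRAME_SIZE = WIDTH * HEIGHT * 3 // 2  # YUV420: full Y plane + quarter-size U and V

def luma_plane(frame: bytes):
    """Return the grayscale (Y) plane as rows; U/V planes follow but are ignored."""
    y = frame[:WIDTH * HEIGHT]
    return [y[r * WIDTH:(r + 1) * WIDTH] for r in range(HEIGHT)]

def mean_brightness(frame: bytes) -> float:
    """Average luma value, a trivial stand-in for real per-frame analysis."""
    y = frame[:WIDTH * HEIGHT]
    return sum(y) / len(y)

# Synthetic mid-gray frame in place of one read() from the camera pipe.
frame = bytes([128] * FRAME_SIZE)
print(mean_brightness(frame))  # 128.0
```

A real face detector would consume the same Y plane (most detectors operate on grayscale), and since the capture resolution can change on the fly, `WIDTH`/`HEIGHT` would have to be re-read from the capture side rather than hard-coded as above.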