Right now the Pipeline sends frames to the streamer through a queue. If I remember correctly, this takes about 2 ms at 320x240 and 10 ms at 640x480, which is too long.
The first thing we need to do is establish a baseline. Set the pipeline to find last year's target, run the camera connected to a client (so streaming) for a minute or so, and record the results from the logs.
Something else worth looking into is whether the cost comes from actually putting the frame in the queue or from the conversions the pipeline does. (It scales the numpy array down to 320x240 and then converts it to a bytes object representing a .jpg image.)
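To separate the two costs, a rough instrumentation sketch like the one below could help. The payload size (~20 KB) is a guess at a 320x240 JPEG, and the queue here stands in for the real pipeline-to-streamer queue; swap in real frames to get meaningful numbers.

```python
import time
from multiprocessing import Queue

# Dummy payload sized roughly like a 320x240 JPEG (assumed size, not measured).
frame_jpg = bytes(20_000)

q = Queue()

# Time only the enqueue step, isolated from any resize/encode work.
t0 = time.perf_counter()
q.put(frame_jpg)
put_ms = (time.perf_counter() - t0) * 1000

q.get()  # drain so the feeder thread can exit cleanly
q.close()
q.join_thread()
print(f"queue.put took {put_ms:.3f} ms")
```

Wrapping the scale-down and JPEG conversion in the same `perf_counter` pattern would give the other half of the picture, so the logs can attribute the latency to one step or the other.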
Once we establish a baseline we can move on to testing possible solutions. What I have in mind is:
- Pipes. Queues in Python are built on top of Pipes, so a raw Pipe can be up to ~3x faster. (According to some Stack Overflow post - looks legit enough, though.) Worth a shot.
- Use a buffer, specifically io.BytesIO(). Not sure if this is a viable solution, but worth a shot.
- If the queue isn't the issue, maybe just move the processing of the frames to the streamer.
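For the Pipe idea, a micro-benchmark along these lines could test the claimed speedup before touching the pipeline. Everything here runs in one process (no child process), so it only measures serialization plus transfer, and the ~20 KB payload is an assumed frame size.

```python
import time
from multiprocessing import Pipe, Queue

payload = bytes(20_000)  # assumed size of an encoded 320x240 frame
N = 100

# Round-trip through a multiprocessing.Queue.
q = Queue()
t0 = time.perf_counter()
for _ in range(N):
    q.put(payload)
    q.get()
queue_ms = (time.perf_counter() - t0) * 1000 / N
q.close()
q.join_thread()

# Round-trip through a raw Pipe, using the bytes-oriented API.
parent, child = Pipe()
t0 = time.perf_counter()
for _ in range(N):
    parent.send_bytes(payload)
    child.recv_bytes()
pipe_ms = (time.perf_counter() - t0) * 1000 / N

print(f"Queue: {queue_ms:.3f} ms/frame, Pipe: {pipe_ms:.3f} ms/frame")
```

If the Pipe numbers hold up with real frame sizes, swapping the queue for a Pipe pair is a small change; if they don't, that rules this option out cheaply.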
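For the io.BytesIO idea, the plausible win is reusing one in-memory buffer per frame instead of allocating a fresh bytes object every time. This sketch simulates that with fake JPEG bytes (the real pipeline would still need its actual encoder writing into the buffer); whether it actually helps is exactly what the baseline test should tell us.

```python
import io

def encode_frame_into(buf: io.BytesIO, raw: bytes) -> memoryview:
    """Write (simulated) encoded frame bytes into a reused buffer and
    return a zero-copy view of the valid region."""
    buf.seek(0)
    buf.write(raw)
    buf.truncate()  # discard any leftover bytes from a larger previous frame
    return buf.getbuffer()

buf = io.BytesIO()  # allocated once, reused for every frame
fake_jpg = b"\xff\xd8fake jpeg\xff\xd9"  # placeholder, not a real JPEG
view = encode_frame_into(buf, fake_jpg)
print(len(view))
```

One caveat: the returned memoryview must be released (or converted with `bytes(view)`) before the next frame is written, or BytesIO will refuse to resize the buffer.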