ipartola / hawkeye

A simple and fast USB webcam MJPEG streaming server.

Server supplies each frame multiple times #11

Closed mberntsen closed 6 years ago

mberntsen commented 7 years ago

I'm running the software on Ubuntu, and when I open the stream and save it to a file (for example with wget), the file grows very quickly. After splitting on the boundaries with csplit, each frame turns out to be received 5 to 15 times. Viewing the stream in Firefox generates an equally large data stream (6 MB/s).

Still investigating the cause...

mberntsen commented 7 years ago

The problem lies here (snippet):

```c
if (c->current_frame < c->fb->current_frame || (c->current_frame == c->fb->current_frame && c->current_frame_pos < f->data_len)) {
    len = client_write(c, &f->data[c->current_frame_pos], f->data_len - c->current_frame_pos);
    c->current_frame_pos += len;

    if (c->current_frame_pos == f->data_len) {
        // Catch them up to the latest frame
        c->current_frame = c->fb->current_frame;
        c->current_frame_pos = 0;
    }
}
```

Even when the current frame has been fully written to the socket, the code still resets `current_frame_pos` to 0, so the same frame is written again on the next pass. Changing `c->current_frame = c->fb->current_frame;` to `c->current_frame++` fixes this issue, but then the timeout causes problems (effectively reducing the framerate by a factor of 2).
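
Concretely, the change is just the last part of the block above (a sketch; everything else stays as-is):

```c
    if (c->current_frame_pos == f->data_len) {
        // Move to the frame after the one we just finished, instead of
        // snapping back to fb->current_frame. When the client is caught up,
        // fb->current_frame is the frame we just sent, so resetting the
        // position to 0 made it get sent again.
        c->current_frame++;
        c->current_frame_pos = 0;
    }
```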

ipartola commented 7 years ago

Very good catch! I see the problem, though I'm not sure exactly how the frame rate is reduced by a factor of two. I think the solution is to only advance to the next frame if one exists, but I'll play around with it.

mberntsen commented 7 years ago

I've changed the timeout in the main loop to a fixed value of 0.01, which fixed the framerate reduction issue. What is the main reason for this timeout?

ipartola commented 7 years ago

So I have a fix similar to yours, though I'm not sure which one is philosophically better. Yours simply advances the client's current frame to the next frame, and at the top of the `if (c->request == REQUEST_STREAM)` block I check whether the frame the client is on exists and just bail if that frame is not ready yet. My fix only advances to the next frame if it exists. That philosophy stuff aside, I think they produce the same result.
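
Roughly, the "only advance if the next frame exists" variant changes the same spot in the snippet above to something like this (a sketch of the idea, not the exact patch):

```c
    // Only move on once the framebuffer actually holds a newer frame.
    // If the client is caught up, current_frame_pos stays at f->data_len,
    // so the streaming condition above stays false and the finished frame
    // is not re-sent.
    if (c->current_frame_pos == f->data_len && c->current_frame < c->fb->current_frame) {
        c->current_frame++;
        c->current_frame_pos = 0;
    }
```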

The timeout exists because the main loop consists of two parts: serving network requests and grabbing frames from the cameras. The timeout is meant to guarantee that we don't spend all of our time serving network requests, and that grabbing frames at the specified FPS takes priority. It caps the maximum FPS at 20 because the minimum timeout is 0.05 seconds. Basically, we never want the frame grabbing to fall behind because of slow network clients.
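
For reference, the relationship is roughly this (a simplified sketch; the names and exact formula are assumptions, only the 0.05 s floor and the 20 FPS cap come from the paragraph above):

```c
#define MIN_TIMEOUT 0.05   /* seconds; 1 / 0.05 = 20 FPS maximum */

/* Timeout for one main-loop iteration, derived from the target FPS. */
static double loop_timeout(double fps) {
    double t = 1.0 / fps;                   /* e.g. 15 FPS -> ~0.066 s */
    return (t < MIN_TIMEOUT) ? MIN_TIMEOUT : t;
}
```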

I am not sure exactly why the framerate drops, but I see the effect. Basically my timeout for 15 FPS tends to be 0.066 s or so (minus a few dozen milliseconds to grab the frame), so I'd expect the client to receive 15 FPS no problem, but in the output stream I am clearly seeing only 7-8 FPS, which is concerning. I'd like to understand why that is before re-working this bit.

ipartola commented 7 years ago

I think I've got it. Instead of letting serve_clients() run with a while loop until the timeout runs out, I'm just letting it run through once. This should be significantly more efficient. Give this a try and see if it works for you. If so, I'll push this to master and push out new builds since this is a pretty major bug.
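
In outline, the change looks roughly like this (a sketch; only the serve_clients() name comes from the code, its signature and grab_frames() are stand-ins for the surrounding main loop):

```c
/* Before (sketch): serve_clients() kept looping over the sockets until
 * the frame timeout expired, so a caught-up client could be served the
 * same frame repeatedly within one interval.
 *
 * After (sketch): a single pass per main-loop iteration, then grab the
 * next frame, so slow clients never starve the frame grabber. */
for (;;) {
    serve_clients();   /* one pass over the ready client sockets */
    grab_frames();     /* hypothetical stand-in for the camera capture step */
}
```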

ipartola commented 6 years ago

I released the fix in version 0.6.