ayufan / camera-streamer

High-performance low-latency camera streamer for Raspberry PI's

Feature Request - new capture snapshot endpoint to get the next frame #53

Closed FormerLurker closed 1 year ago

FormerLurker commented 1 year ago

First, very nice project! Thank you for your time and effort.

Background

I have been testing compatibility with an Octoprint plugin I wrote in preparation for people using camera-streamer within OctoPrint. My plugin creates a timelapse for 3d printers by 'posing' the 3d printer (usually by parking the printhead in a specific X/Y location), grabbing a snapshot, and then resuming the print. Here is a sample video from a long while back. This video was made using snapshots taken from an mjpg-streamer fork, which returns the next available frame when requesting a snapshot from the camera. And here is comparison (not my video), with and without Octolapse so you have a better idea what I'm trying to achieve.

The Problem

In testing camera-streamer, I noticed that the frame returned from the /snapshot endpoint can be from up to one second in the past with my settings (15 frames behind in my case). This means the image I am capturing is from before the printer made it to the specified X/Y position. My plugin has a 'snapshot delay' that can compensate for this, but 1000MS is quite a large delay, and it causes the following issues:

- Filament oozes from the nozzle during this time, negatively affecting print quality.
- Since each timelapse can contain well over 1000 snapshots, the additional delay adds to the print time.
- Octolapse also has a feature called 'snap to print' which requires extremely low snapshot-acquisition delays in order to function properly without seriously impacting print quality.

Possible Solution

First, are there any settings tweaks that exist already for dealing with my issue? If so, this feature request may be unnecessary.

If not, I believe adding a special endpoint, like /next-snapshot or /new-snapshot or something (I'm bad at naming endpoints, lol!), that waits until the next frame is available before returning an image would solve this problem 100%. The average delay in receiving a frame would then be 1/FPS * 0.5 seconds, which would minimize both the print-time and oozing issues. Alternatively, perhaps a query string parameter could be added to an existing endpoint (like /snapshot?next-frame=true or something). Thoughts?
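The 1/FPS * 0.5 figure can be sanity-checked with a quick sketch (my illustration, assuming frames arrive at a steady rate and the request lands at a uniformly random point inside a frame interval):

```python
def next_frame_wait_ms(fps: float) -> tuple[float, float]:
    """Return (average, worst-case) wait in ms for the next frame,
    assuming uniform frame arrivals with period 1/fps."""
    interval_ms = 1000.0 / fps
    # A request lands, on average, halfway through a frame interval,
    # so the mean wait for the *next* frame is half the interval.
    return interval_ms * 0.5, interval_ms

avg_ms, worst_ms = next_frame_wait_ms(30)
print(f"30 FPS: avg {avg_ms:.1f} ms, worst {worst_ms:.1f} ms")
```

At 30 FPS that works out to an average wait of about 16.7 ms and a worst case of about 33.3 ms, on top of whatever the pipeline and network add.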

Assistance

I am willing to attempt to add this feature myself if you don't have the time/desire to tackle it. I looked through the code, but it will take me quite a while to become familiar enough with how it works to make any progress. I would be extremely grateful for any ideas you might have for how to accomplish this, and any direction you can give me that will save time.

I appreciate the time you spent reading through this issue, as I'm sure you are a busy person. Thank you!

kbingham commented 1 year ago

We should be able to get this synchronised in a few different ways. Will definitely be an interesting exercise, and as an existing user of octolapse, definitely something I'd be interested in helping out with! It's about time I got my prusa running libcamera anyway :-)

ayufan commented 1 year ago

In general:

You could just try changing https://github.com/ayufan/camera-streamer/blob/master/output/output.c#L3 to be 0 instead of 1000 and see if this helps. I believe it should, and /snapshot should then always return a frame captured after the request is sent. If it does, I will change it in the source code.

kbingham commented 1 year ago

@ayufan do you expose the SensorTimestamp metadata in the output stream at all?

ayufan commented 1 year ago

@kbingham It is used internally, but it is not yet exposed. Technically the best place would be to include it as a header, for both snapshot and video.

kbingham commented 1 year ago

There's a fair bit here that we'll want to look at later (or sooner, hopefully) in the libcamera internals for things like ZSL (zero shutter lag) that may be relevant here. But if Octolapse also "knows" in advance when it wants to take a snapshot, it makes me wonder whether we might want an efficient way to schedule a request on the still stream in libcamera. I'll pose it to the rest of the libcamera team. It's a slightly different use case that I don't think we've considered yet.

ayufan commented 1 year ago

@kbingham Yes, currently it is not possible to schedule an exact time. The only way is to enqueue and wait for buffers to be dequeued. So, basically, the code that drops stale frames has to cycle through all enqueued buffers before it finds one that is recent.
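A simplified sketch of that dequeue-and-discard loop (my illustration, not the actual camera-streamer code; names are made up):

```python
def first_fresh_frame(buffers, request_ts_ns):
    """Walk enqueued buffers in queue order and return the first frame
    whose SensorTimestamp is newer than the request time, else None.

    buffers: iterable of (sensor_timestamp_ns, frame) pairs.
    """
    for sensor_ts_ns, frame in buffers:
        if sensor_ts_ns > request_ts_ns:
            return frame  # captured after the request: fresh enough
        # otherwise this buffer is stale; drop it and keep cycling
    return None  # every queued buffer predates the request
```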

I do track enqueue time and also SensorTimestamp to handle this: https://github.com/ayufan/camera-streamer/blob/master/device/libcamera/buffer.cc#L146

ayufan commented 1 year ago

Ah, the CAPTURE_TIMEOUT_US might also be reduced: https://github.com/ayufan/camera-streamer/blob/master/device/links.c#L255.

FormerLurker commented 1 year ago

@kbingham and @ayufan, thank you so much for taking a look! I'm currently studying your code so that I can have some hope at following the discussion.

as an existing user of octolapse

Glad you got some use out of it!

But if octolapse also "knows" when it wants to take a snapshot in advance it makes me wonder if we might want a way to schedule a request on the still stream efficiently in libcamera.

Unfortunately, I don't know exactly when I want the frame because I don't know exactly when the printer will be in position. This could be estimated, but probably not calculated exactly. Under the hood, Octolapse sends gcodes to prepare to take a snapshot, then sends an M400 command (wait for moves to finish) + an M115 command (report the current position). Once it gets a response to the M115 command, we know the printer is in position and it's time to capture a snapshot. The delay depends on how long the moves take to complete and how many gcodes were already in the printer's buffer (this is configurable). In my experience it can be anywhere from a few MS to tens of seconds.
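The handshake described above can be sketched as a tiny state machine (a hypothetical helper, not actual Octolapse code; the park position and the `FIRMWARE_NAME` response prefix are illustrative assumptions):

```python
class SnapshotGate:
    """Send pose gcode + M400 + M115; any M115 reply means M400 has
    completed, so the printhead is parked and a snapshot can be taken."""

    def __init__(self, send):
        self._send = send      # callable that queues one gcode line
        self._waiting = False

    def pose_printer(self):
        self._send("G1 X0 Y0 F9000")  # hypothetical park position
        self._send("M400")            # wait for all moves to finish
        self._send("M115")            # printer replies once queue drains
        self._waiting = True

    def on_response(self, line: str) -> bool:
        """Return True when the printer is in position."""
        if self._waiting and line.startswith("FIRMWARE_NAME"):
            self._waiting = False
            return True  # now request the snapshot
        return False
```

The unpredictable part is how long elapses between `pose_printer()` and the matching `on_response()`, which is why the snapshot endpoint itself needs to be fresh rather than scheduled in advance.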

there's already old frame detection

I will continue to read the code, and will hopefully then not be so ignorant of the internals. Your links were very informative, so thank you for that!

ayufan commented 1 year ago

@FormerLurker I think if you fine-tune those two params down to closer to 100ms, you should expect /snapshot to provide an up-to-date frame. Maybe /snapshot needs to be improved to require that the frame received is newer than the request. That seems doable, but maybe start by just fixing the timeouts: the first one to 0, and then the second (this might be tricky, as it measures the whole pipeline, which in some cases takes closer to 250ms)...

kbingham commented 1 year ago

Thanks for the insight. If we can't predict exactly when to take a snapshot then indeed we just have to work towards lowering all the latency from the point we "know" the frame is suitable.

FormerLurker commented 1 year ago

Maybe the /snapshot needs to be improved to require that frame received is newer than request.

Even an alternative endpoint that does this would be fine, as that would allow the current /snapshot request to return more quickly for cases where a new frame isn't required. Ideally, Octolapse would always receive a new frame, never an old one, as that essentially removes the need to build any delay into Octolapse. If I only know that I will get the most recent frame, I still need to build in a delay of at least 1/FPS seconds before making a request. If I can count on getting the next frame, the average response time (ignoring processing and round-trip time) would be 1/FPS*0.5 seconds, which is the absolute minimum it could be, and would be fantastic.

ayufan commented 1 year ago

@FormerLurker @kbingham Please check this in develop: https://github.com/ayufan/camera-streamer/commit/9224db147e84b7505e541703c6a28c715573e218.

It ensures that the captured snapshot is at most a predictable delay old (default 300ms), but it can be made to capture at this particular moment with ?max_delay=0.
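For anyone following along, a minimal client-side sketch of hitting the new parameter (the host and port are assumptions for a typical OctoPi setup):

```python
from urllib.parse import urlencode

def snapshot_url(base: str, max_delay_ms: int = 0) -> str:
    """Build the /snapshot URL with the max_delay query parameter."""
    return f"{base}/snapshot?{urlencode({'max_delay': max_delay_ms})}"

url = snapshot_url("http://octopi.local:8080", max_delay_ms=0)
# urllib.request.urlopen(url, timeout=5).read() would fetch one JPEG,
# guaranteed (with max_delay=0) to be captured after the request arrived
```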

FormerLurker commented 1 year ago

Building and installing now, THANK YOU!!! I will report back after extensive testing.

Elitesoulman commented 1 year ago

@FormerLurker If any further testing is needed I'm happy to assist. I'm using a Pi 3b+ and the Pi Cam v3. I've been following the updates closely and am very excited to see this progress so quickly.

FormerLurker commented 1 year ago

@ayufan, initial testing shows GREAT success! I was able to set my camera delay to 0MS and still got a pretty much perfect timelapse. I've only tested the usb camera so far.

Performance

The maximum acquisition time I've seen (time to get a response from the /snapshot endpoint, but before downloading the entire image) using the new max_delay=0 querystring parameter is 158MS. The min was 51MS, and the average was 107MS. At 30FPS the average expected delay between frames (with no processing time) would be 62.5MS, so this suggests it takes, on average, approximately 44.5MS for camera-streamer to process and return a frame on my pi3b (ignoring the wait time for the next frame), which is VERY impressive!

This performance is significantly better than mjpg-streamer, which was running at least 2x the acquisition time compared to this new branch. It could even be a 4x improvement, but I didn't save the old framerate (15fps vs 30fps). I will re-test later to confirm. This also ignores the default delay setting, which can now be lowered, potentially to 0.
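The min/avg/max figures above came from repeated timed requests; here is a sketch of that kind of measurement loop (my own helper, not Octolapse code; the endpoint URL is an assumption, and the live loop is commented out so the summary helper stands alone):

```python
import time
from urllib.request import urlopen

def summarize_ms(samples):
    """Return (min, avg, max) of a list of millisecond timings."""
    return min(samples), sum(samples) / len(samples), max(samples)

def time_snapshot_ms(url: str) -> float:
    """Time one acquisition: request sent -> first response byte in
    (matches 'before downloading the entire image' above)."""
    start = time.monotonic()
    with urlopen(url, timeout=5) as resp:
        resp.read(1)  # headers + first byte; full body not counted
    return (time.monotonic() - start) * 1000.0

# samples = [time_snapshot_ms("http://octopi.local:8080/snapshot?max_delay=0")
#            for _ in range(50)]
# print(summarize_ms(samples))
```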

What this means for Octolapse

In ideal cases (low mechanical vibration for both the printer and the camera) the 'camera delay' can be set to 0MS! Also, the average snapshot acquisition time has dropped (for my tests) by between 107MS and 214MS. For my very small test print of 189 layers, that shaved between 20-40 seconds from the print time (need to re-test with the old streamer for confirmation). Setting the camera delay to 0MS removed an additional 23 seconds. That may not sound like a lot, but it's a TON of time for filament to ooze from the nozzle, which should boost quality considerably, especially for users printing at very fine quality (<0.2mm layer heights).

I've spent months shaving milliseconds from the snapshot process, and this is probably the single biggest jump in performance yet, so THANK YOU VERY MUCH! I will try to capture the difference in a video (it should be obvious) that I can post to YouTube, and will link to your repo if you are comfortable with that. I'm also going to update the OctoPrint issue with my findings. I'm very excited for this to be the new default stack!

I'll continue to update this issue as I learn more. I will be busy for the next couple days, but hope to start on Sunday if possible.

@Elitesoulman, if you can test this with the pi cam, that would be fantastic! Just make sure you change your base and snapshot address in your Octolapse camera profile accordingly:

image

Also, double check the /index page to ensure you are running the new version (I had to manually stop the existing camera-streamer service, which was customized, and start the new one I built from the develop branch):

image

Elitesoulman commented 1 year ago

@FormerLurker I'm on it! I'll report back shortly.

ayufan commented 1 year ago

@FormerLurker

Interesting. The:

FormerLurker commented 1 year ago

@ayufan,

I don't think I was very clear in explaining the performance gain. I was comparing mjpg-streamer (the previous stack) to camera-streamer only, not the relative performance of /snapshot vs /snapshot?max_delay=0. Before this update, mjpg-streamer was, overall, faster from the perspective of Octolapse (how long it takes to actually get an image I can be sure is newer than the sent request). Now camera-streamer appears to be substantially faster. With the old version of camera-streamer, I was adding a 125MS delay to get a future frame, though I believe I could have shaved it down by a few MS. So, from my perspective:

(Time to Acquire) = (Delay Time) + (Round Trip Time)

Delay time is now 0MS, and round trip time has NOT increased by 125MS on average, which is where the boost is coming from.

However, I was looking back on my calculations, and I appear to have used the wrong framerate in my xls sheet (I was rushing to provide results). I will update once I have a proper comparison.

Elitesoulman commented 1 year ago

I'm newer on the Github side of things, what is the best way to update camera-streamer to make sure I've got the right (develop) version? I'm happy to try anything I think I just jumped in a bit too deep.

FormerLurker commented 1 year ago

@Elitesoulman,

I basically followed the instructions in readme.md, but I will add the specifics assuming you have the rc version of OctoPi flashed (see this issue).

  1. SSH into your pi.
  2. Enter the following commands one at a time:

```shell
git clone -b develop https://github.com/ayufan-research/camera-streamer.git --recursive
sudo apt-get -y install libavformat-dev libavutil-dev libavcodec-dev libcamera-dev liblivemedia-dev v4l-utils pkg-config xxd build-essential cmake libssl-dev
cd camera-streamer/
make
sudo make install
```

Next, stop your current service:

```shell
systemctl stop camera-streamer
```

If you are running another version of OctoPrint that uses mjpg-streamer, the same instructions should work, but you have to stop mjpg-streamer instead (I believe; I haven't tested this):

```shell
systemctl stop webcamd.service
```

Now, enable and start the new service:

```shell
systemctl enable $PWD/service/camera-streamer-raspi-v3-12MP.service
systemctl start camera-streamer-raspi-v3-12MP
```

Note that you may have to authenticate after each of the commands above.

Now, check to make sure your camera is streaming with the new develop branch by looking at:

http://{IP_OF_PI}:8080

Make sure you see this:

image

Install Octolapse, setup your printer profile (it's a default profile, just add and import the Prusa Mk3 profile), and change your octolapse camera profile to this:

image

Elitesoulman commented 1 year ago

Thank you! I was so close, I didn't have the -b develop syntax correct. I'm updating now and will run a short test print momentarily.

FormerLurker commented 1 year ago

You might want to edit the logging profile in Octolapse, and customize it like so (making sure you clear the current logs first):

image

Then you can download the log and post to gist.github.com, dropping in a link. We can learn about the timing that way.

Elitesoulman commented 1 year ago

Alright, I've got some data. The logging settings didn't take, but I'll upload here when I have them from the next print. Linked below are the 3 comparable Octolapse timelapses I was able to get, changing the snapshot delay in Octolapse only. Max delay for the snapshot was set to 0.

0ms delay

50ms delay

125ms delay

The results for all three are miles better than what I was getting with the main branch, simply due to cutting down from >1000ms delay to get a stabilized timelapse. Blobs and even stringing are greatly reduced. The main quality issue I see here is that the v3 camera module cannot autofocus as fast as camera-streamer can take the snapshot at 0ms delay. The focus quality improves each time up to the standard delay of 125.

I am doing an overnight print tonight with a different file and have enabled logging and will report back. I will also do some testing without autofocus enabled at 0ms delay and perhaps a different mounting location for the camera.

Thank you all for your hard work, and your patience with a newbie.

ayufan commented 1 year ago

Great result. @Elitesoulman Did you run a comparison test against ustreamer or mjpg-streamer?

mjpg-streamer with libcamera backend will always be slower since it does software encoding.

ayufan commented 1 year ago

@Elitesoulman It is also strongly advised to lock focus, always. With 3d printers you usually have a fixed focal point.

ayufan commented 1 year ago

Closing as resolved. It will be merged at some point into develop.

Elitesoulman commented 1 year ago

@ayufan Thanks again, I did some more testing this morning. I was able to add Afspeed=1 in /boot/camera-streamer/libcamera.conf and got a much better result at 0ms delay, even with autofocus.

0ms delay fast AF

@FormerLurker I'm still working on getting the proper log files for you, I'm only getting the init logs.

Current Log File

I'll continue to test and can report back here or elsewhere.

FormerLurker commented 1 year ago

@Elitesoulman, try installing Octolapse from the devel branch. I fixed a few issues there that might be preventing logging in some cases, caused by mistakes made while removing Python 2 support. Feel free to create an issue on the Octolapse repo since this issue is closed.