mainsail-crew / moonraker-timelapse

Timelapse Plugin for moonraker
GNU General Public License v3.0

Feature Request: Using `raspistill` and `libcamera-jpeg` for High Resolution Images #105

Open michaelmyc opened 1 year ago

michaelmyc commented 1 year ago

I want to use a higher resolution camera (ArduCam 16MP) to take high-resolution images for time-lapse rather than the 1080p video I get from crowsnest. I'm forced to use crowsnest built with ayufan/camera-streamer as the old ustreamer doesn't support libcamera-based cameras, and camera-streamer limits streams to 1080p.

This means if I want to get higher resolution time-lapses, I would need to disable crowsnest and use libcamera-still (or raspistill for older cameras) to take the pictures.

I don't think there's currently any functionality like this in moonraker-timelapse and I read @FrYakaTKoP expressing interest in such a feature in #58. The python code seems simple enough and I'm willing to code it, but I want to get a greenlight from maintainers before working on it.

My proposal is to have an extra setting called "image_source" which defaults to "snapshot" (using the snapshot URL in the moonraker DB), but can also be "libcamera" (libcamera-still) or "raspicam" (raspistill). Additional settings are "image_width", "image_height", and "camera_timeout" for specifying the size of the image and the delay before taking it (useful for AF cameras).

FrYakaTKoP commented 1 year ago

@michaelmyc indeed there is interest in such a feature from the community.

If you ask me about green-lighting this project: sure! But there are some points I'd like to address, or at least raise for consideration:

Also, I'd like to point out that you could already achieve this with the unofficial shell command extension for Klipper, just modifying the macros a bit to trigger the custom shell command instead of timelapse's internal take-frame. Not that I don't like that someone wants to tackle this seriously inside timelapse, but be aware that once this "simple" thing gets into main it will need a lot of support, and I may only be able to point at you if there are problems.
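For reference, the workaround can be sketched as Klipper configuration. This assumes the unofficial `gcode_shell_command.py` extension is installed; the command name `take_still`, the output path, and the resolution are illustrative only, and overriding `TIMELAPSE_TAKE_FRAME` this way is an untested sketch, not a supported setup.

```ini
# Requires the unofficial gcode_shell_command.py Klipper extension.
[gcode_shell_command take_still]
command: libcamera-still -n -t 2000 -o /tmp/timelapse/frame.jpg --width 4656 --height 3496
timeout: 10.
verbose: False

# Replace the frame capture that moonraker-timelapse normally performs
[gcode_macro TIMELAPSE_TAKE_FRAME]
rename_existing: _TIMELAPSE_TAKE_FRAME_BASE
gcode:
  RUN_SHELL_COMMAND CMD=take_still
```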

I'll also tag @KwadFan (maintainer of crowsnest); we may need him further down the line, since I can't help you with any libcamera/raspistill/crowsnest-specific questions. Maybe he has suggestions or concerns about this project too.

KwadFan commented 1 year ago

For now, I have to talk to ayufan, the creator of the new backend in crowsnest. I have a strong feeling that it will interfere with the video stream. Also, I might be wrong here, but that's a special setup, because we never recommend ArduCams due to their proprietary firmware and libcamera stack.

What I don't like is setting up resolutions or anything like that inside the plugin config, for the reasons Fry already mentioned.

So, please be patient until I can talk to ayufan.

Regards Kwad

ayufan commented 1 year ago

@michaelmyc

Currently camera-streamer allows you to define different resolutions for snapshot (JPEG), stream (MJPEG) and video (WebRTC). However, the highest snapshot resolution is 1920x1920, due to the hardware encoder present on the Raspberry Pi.

If you run camera-streamer, it takes exclusive access of the camera and you cannot disable it temporarily. Running libcamera from time to time (instead of continuously) is problematic as well, since it has to re-focus and re-balance each time.

Is 1920x1920 enough for a snapshot? The point is that if this quality is good enough, you can then pretty much fully hardware re-encode it into an H264 stream, which on the Raspberry Pi is around 20x faster than doing the same on the CPU, where you decode the JPEG and encode it into H264.

Theoretically it is possible to extend camera-streamer to do a software encode of the snapshot, but for 8MP images and higher it simply becomes super slow, which might affect the Klipper instance running in the background.

michaelmyc commented 1 year ago

> If I understand this correctly, you couldn't use this new mode and crowsnest on the same camera simultaneously, correct? This will be a hard task to document so every user is aware of that.

Yes. Crowsnest would not work on the time-lapse camera in this new mode. I was thinking of the simple solution of disabling crowsnest during the print and re-enabling it later, but it would make sense, if a user has multiple cameras, to do a high-resolution timelapse on one while monitoring on another. I guess the main question is how to handle interop with crowsnest.
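The "disable crowsnest during the print" idea could be scripted. A minimal sketch, assuming crowsnest was installed as a systemd service named `crowsnest` (the default install) and that the calling user is allowed to run `systemctl` (e.g. via a sudoers rule) — both assumptions, not verified behavior:

```python
import subprocess

CROWSNEST_SERVICE = "crowsnest"  # assumed default systemd unit name

def set_stream_enabled(enabled, dry_run=False):
    """Start or stop the crowsnest streamer so a still-capture tool
    can get exclusive access to the camera. With dry_run=True, only
    return the command that would be executed."""
    action = "start" if enabled else "stop"
    cmd = ["systemctl", action, CROWSNEST_SERVICE]
    if not dry_run:
        subprocess.run(cmd, check=True)
    return cmd

# Typical flow around a high-resolution capture:
# set_stream_enabled(False)   # free the camera
# ... capture stills with libcamera-still ...
# set_stream_enabled(True)    # resume monitoring
```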

> I'm fine with the setting called image_source, but I think the setting about resolution shouldn't be in timelapse. On the one hand it would add confusion ("I'm using snapshot and changing width and height doesn't change the resolution"), and on the other hand it would fit better into the centralized webcam configuration in the moonraker database or into the crowsnest config.

My main intention for having separate width and height settings is that you might want a lower resolution (less CPU usage) for streaming on the camera when you don't want to do high-resolution time-lapse, but when you do want to do time-lapse, you are good to go with the high resolution settings in the timelapse config. It should eliminate the need to go to the settings every time you want to switch things up.

> Can you describe camera_timeout in more detail? Timeout sounds more like "it shouldn't take longer than x seconds or we abort the operation", but "delay before taking the picture" sounds more like compensating for latency between moving the printhead and taking the picture. There is already a 'stream_delay_compensation' in timelapse to compensate for a "slow" stream/camera when using the parking feature, and also 'park_time' to keep the head longer in the parking position.

The main purpose here is that with the Pi Cam 3 and ArduCam IMX519, you have autofocus capabilities. So the camera would try to acquire focus when the preview starts. If we immediately snap the photo, it might not focus on where we want it to focus. If we want autofocus cameras to work, then we might need this sort of a delay. Maybe we use 'stream_delay_compensation' as an alias to 'delay_compensation' so the name fits both scenarios without breaking current configs that use 'stream_delay_compensation'?

> Also, I'd like to point out that you could already achieve this with the unofficial shell command extension for Klipper, just modifying the macros a bit to trigger the custom shell command instead of timelapse's internal take-frame.

Interesting. But that feels really hacky and not really portable. I'd rather we have a native solution here.

> Not that I don't like that someone wants to tackle this seriously inside timelapse, but be aware that once this "simple" thing gets into main it will need a lot of support, and I may only be able to point at you if there are problems.

Happy to help maintain it and answer questions.

michaelmyc commented 1 year ago

> I have a strong feeling that it will interfere with the video stream.

Yes. So we have to figure out some way of pausing a video stream.

> Also, I might be wrong here, but that's a special setup, because we never recommend ArduCams due to their proprietary firmware and libcamera stack.

The firmware is actually in mainline kernel 6.1. If you don't want autofocus, you don't need the ArduCam firmware and libcamera stuff to make it work. If you're satisfied with slow contrast detection autofocus and no continuous autofocus, you can get away with modifying the IMX519.conf file to add a contrast detection autofocus algorithm and native libcamera works (I've tested this). Phase detection autofocus and continuous autofocus are still ArduCam libcamera stack only, unfortunately, but they say they do have plans to push it to libcamera mainline.

I got my ArduCam before realizing that they're using proprietary drivers and libcamera. It's quite messy. If I could dial back time, I'd spend the extra 15 bucks and get the Pi Cam 3 instead.

> What I don't like is setting up resolutions or anything like that inside the plugin config, for the reasons Fry already mentioned.

It's not perfect, I agree. However, if someone only has 1 camera, and wants to switch between stream and timelapse mode depending on the print, they would have to go to the settings to manually change the resolution of the camera, right? If we name it more explicitly and put it in a separate section in our documentation to highlight what this feature is intended for, maybe that could resolve your concerns?

michaelmyc commented 1 year ago

> If you run camera-streamer, it takes exclusive access of the camera and you cannot disable it temporarily. Running libcamera from time to time (instead of continuously) is problematic as well, since it has to re-focus and re-balance each time.

Yes, that's why I was mentioning a delay to let libcamera have time to adjust focus, exposure, and white balance.

> Is 1920x1920 enough for a snapshot? The point is that if this quality is good enough, you can then pretty much fully hardware re-encode it into an H264 stream, which on the Raspberry Pi is around 20x faster than doing the same on the CPU, where you decode the JPEG and encode it into H264.

If I'm making a video, I'd prefer the higher resolution (more than 1080p), but when I'm streaming to the tiny window in the Mainsail web interface, I only really need 480p to monitor if my print has become spaghetti. Does camera-streamer utilize the H264/H265 hardware encoding on video streams today? I think you can also use different sensor modes in libcamera, but I'm not sure how that's implemented in camera-streamer or how that affects CPU.

> Theoretically it is possible to extend camera-streamer to do a software encode of the snapshot, but for 8MP images and higher it simply becomes super slow, which might affect the Klipper instance running in the background.

That would be great, but we do need to do some testing to see how much of an impact it has on the CPU. I don't have a Pi Zero 2, so I can't test on the lower tier of recommended hardware. I only have a Pi 3B+ and a BTT CB1 mounted on an M4P board. I think I've seen that the CB1 has hardware decode capabilities for 4kp60 H264 and H265. The Pi 4 should have hardware decode for 4kp60 H265 but only 1080p60 H264. The Pi 3 only has hardware decode for 1080p60 H264.

I'm happy to test on the hardware I have, but considering the price of a Pi 4, I'm not going to get one for testing. I do have experience with C++, so I'm happy to delve into the coding as well if needed, but I'll need some time to familiarize myself with the camera-streamer codebase, which is much larger and lower-level than moonraker-timelapse.

My main concern is that if the sensor mode of the camera needs a lot of CPU time, we could be opening a huge can of worms in terms of overheating, bottlenecking Klipper, etc., if we decide to use the snapshot as the time-lapse frame source. Another issue is that utilizing the full sensor might mean <10fps in the stream, which makes me question whether it's even worth having the stream at that point.

ayufan commented 1 year ago

I might disappoint you :)

> Does camera-streamer utilize the H264/H265 hardware encoding on video streams today?

It does, and the same goes for JPEG encoding. This is why the snapshot size is limited to 1920x1920 max on today's RPi.

> I think you can also use different sensor modes in libcamera, but I'm not sure how that's implemented in camera-streamer or how that affects CPU.

You can, and camera-streamer does rescaling (via the ISP) of outputs to provide the desired resolution.

> I think I've seen that the CB1 has hardware decode capabilities for 4kp60 H264 and H265. The Pi 4 should have hardware decode for 4kp60 H265 but only 1080p60 H264. The Pi 3 only has hardware decode for 1080p60 H264.

It is all about software support: the CB1 is based on the H616, which does not expose many of those hardware VPUs via V4L2. So it is pretty much useless here.

> I'm not going to get one for testing. I do have experience with C++, so I'm happy to delve into the coding as well if needed

It will not be easy, since the problem does not lie with camera-streamer, but rather with what the system provides under /dev/video* and what the max resolutions there are. This is why tools/dump_cameras.sh exists: to see what V4L2 M2M devices are available.
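Even without the crowsnest helper script, the device nodes themselves are easy to enumerate. A minimal Python sketch; note that the actual M2M capability check dump_cameras.sh performs would still need v4l2-ctl or the V4L2 ioctls, which this deliberately skips:

```python
import glob
import os

def list_video_nodes(dev_dir="/dev"):
    """Return the V4L2 device nodes (/dev/video*) present on the
    system, sorted. Only lists nodes; it does not identify which
    ones are hardware M2M encoders/decoders."""
    pattern = os.path.join(dev_dir, "video*")
    return sorted(glob.glob(pattern))

print(list_video_nodes())  # e.g. ['/dev/video0', '/dev/video10', ...]
```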

> My main concern is that if the sensor mode of the camera needs a lot of CPU time, we could be opening a huge can of worms in terms of overheating, bottlenecking Klipper, etc.

This is why this project currently does not implement any software encode functions; they are simply very expensive. By using only hardware it has to work within hardware limits, but on the other hand it has superior performance.

simonvez commented 1 year ago

I recently upgraded my Pi to the latest MainsailOS, only to find that raspistill won't work anymore :( So my camera is not good for timelapse now, as it crops in at 1080p. Before, I was set up like this:

```
camera_raspi_options="-usestills -fps 2 -x 1920 -y 1080 -awb off --awbgainR 2.0 --awbgainB 1.2 -quality 100 -ev -5"
```

So there is no way to work in still mode anymore? I was more than happy with just 2 FPS :)

simonvez commented 1 year ago

It's not bad at 1640x1232, but it was way better in still mode :P

michaelmyc commented 1 year ago

@simonvez Maybe try libcamera-still. raspistill doesn't support the newest Pi cameras, so everything is moving to libcamera.

ayufan commented 1 year ago

@simonvez I think you need to explain more what the differences are :) I think all of that can be tuned. Yeah, the default 1920x1080 might crop on some cameras; this is why you need to specify a native sensor size for now.

simonvez commented 1 year ago

The difference is quite big when using still mode. I know the new Bullseye doesn't support raspistill and it's been replaced by libcamera-still, but I don't know how to implement that in the new mainsail/crowsnest config file. Any tip on that? And THANKS a bunch for the help, you guys rock!

ayufan commented 1 year ago

@simonvez You mean `-usestills -fps 2 -x 1920 -y 1080 -awb off --awbgainR 2.0 --awbgainB 1.2 -quality 100 -ev -5`? Can you give me details about your camera, and also run tools/dump_cameras.sh?

simonvez commented 1 year ago

I run a Raspi Cam v2.1 with the Sony IMX219 sensor. It's an 8MP sensor. As for dump_cameras.sh... I am not sure where to find that script.

```
pi@vz330:~/crowsnest/tools $ ls -alh
total 56K
drwxr-xr-x  2 pi pi 4.0K Mar 26 08:09 .
drwxr-xr-x 10 pi pi 4.0K Mar 26 08:09 ..
-rwxr-xr-x  1 pi pi 7.7K Mar 26 08:09 configure.sh
-rwxr-xr-x  1 pi pi 9.5K Mar 26 08:09 dev-helper.sh
-rwxr-xr-x  1 pi pi  19K Mar 26 08:09 install.sh
-rwxr-xr-x  1 pi pi 5.8K Mar 26 08:09 uninstall.sh
pi@vz330:~/crowsnest/tools $ pwd
/home/pi/crowsnest/tools
```

FrYakaTKoP commented 1 year ago

@simonvez One alternative would be to install the unofficial shell command Klipper extension and replace the take-frame macro with a custom shell command to collect the stills (without using crowsnest or any other streamer). This will break timelapse a fair bit (frame counter and stuff), but if the frames are named right you can still use timelapse render to trigger ffmpeg. I can't help you with the correct shell command for your camera, but once you have a working one I can help you hack the rest together (just ping me on Discord). -FrY

ayufan commented 1 year ago

@simonvez

I pushed some changes to make things more configurable in camera-streamer. You need to pull the develop branch and then configure the following options:

```shell
# Run in shell to fetch develop
cd ~/crowsnest/bin/camera-streamer
git fetch
git checkout origin/develop
make clean
make
sudo make install
```

```
# Configure crowsnest.conf
resolution: 3280x2464
max_fps: 15
custom_flags: --camera-options=ColourGains=2.0,1.2 --camera-snapshot.options=compressionquality=100
```

The resulting resolution should be around 1437x1080, aligned to 32x32 blocks => 1408x1056, since it wants to maintain the aspect ratio of the 3280x2464 (4:3) sensor capture.
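The arithmetic behind those numbers can be sketched as follows. This is an illustration of the scaling and 32-pixel alignment described above, not code taken from camera-streamer:

```python
def scaled_resolution(sensor_w, sensor_h, target_h, block=32):
    """Scale the sensor resolution down to target_h while keeping the
    aspect ratio, then align both dimensions down to the encoder's
    block size (32x32 on the Pi hardware encoder, per the discussion)."""
    scaled_w = sensor_w * target_h // sensor_h   # preserve aspect ratio
    aligned_w = scaled_w // block * block        # round down to block
    aligned_h = target_h // block * block
    return aligned_w, aligned_h

# IMX219 full sensor 3280x2464 scaled toward 1080p:
print(scaled_resolution(3280, 2464, 1080))  # -> (1408, 1056)
```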

If you care about stabilised timelapses, like the Octolapse folks, your snapshot capture endpoint should look like this: http://my-printer:8080/snapshot?max_delay=0.

skyeshadow94 commented 10 months ago

Hi there, following this to see if I can use the Camera Module 3 Wide at a higher res. Is this correct? If so, could someone detail how I would go about that, as currently if I up the res in the .conf I get no image.