Live video would be a nice thing for the Moon, the planets and the Sun. The indi_simulator_ccd has a tab for video streaming, but I could not find a description of the protocol. Any information about the following would be interesting:
Once I have this information I can check whether an implementation in the driver is possible. For video streaming I would prefer not to pass the data through Python; direct support by libcamera would be desirable. This is already implemented in libcamera for some data formats.
Currently I'm using these two commands for streaming:
Raspberry Pi terminal command:
libcamera-vid -t 0 --inline --nopreview -o - | cvlc stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8080}' :demux=h264
Stream player: VLC or any other player capable of receiving an RTSP stream:
rtsp://<ip-addr-of-server>:8080
The Picamera2 library has some examples (https://github.com/raspberrypi/picamera2/tree/main/examples) of how to handle streaming, but I'm sure you are already familiar with those.
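For example, the streaming part of those examples boils down to something like this (just a sketch; the resolution, bitrate and UDP target are placeholders you would adapt):
import time
from picamera2 import Picamera2
from picamera2.encoders import H264Encoder
from picamera2.outputs import FfmpegOutput

picam2 = Picamera2()
picam2.configure(picam2.create_video_configuration(main={"size": (1280, 720)}))
encoder = H264Encoder(bitrate=2_000_000)
# Push the encoded H.264 stream as MPEG-TS over UDP to any reachable player.
output = FfmpegOutput("-f mpegts udp://<ip-addr-of-client>:8080")
picam2.start_recording(encoder, output)
time.sleep(60)  # stream for a minute, then stop
picam2.stop_recording()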
Thank you. My problem is not making a stream with libcamera or Picamera2. The issue is transferring this stream in conformance with the INDI protocol.
In the meantime I found this documentation https://docs.indilib.org/drivers/binary-transfers.html, which still leaves many questions. What I understand is that the traditional stream transfer needs base64 encoding, which takes a long time on a Raspberry Pi. (That is one reason why it takes so long between the end of an image exposure and getting the image in EKOS; for video streams it will be too slow for a usable frame rate.)
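To illustrate: as far as I understand that page, a classical BLOB transfer wraps every frame into a setBLOBVector XML message with a base64-encoded payload, roughly like this sketch (the device, property and format names are just placeholders; the b64encode step is what costs the time on the Pi):
import base64

def frame_to_setBLOBVector(frame_bytes, device="MyCamera", prop="CCD1"):
    # Classical INDI BLOB transfer: the raw frame is base64 encoded and
    # embedded into a setBLOBVector message before it is sent to the client.
    payload = base64.b64encode(frame_bytes).decode("ascii")
    return (
        f'<setBLOBVector device="{device}" name="{prop}">\n'
        f'  <oneBLOB name="{prop}" size="{len(frame_bytes)}" format=".stream">\n'
        f'{payload}\n'
        f'  </oneBLOB>\n'
        f'</setBLOBVector>'
    )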
There is a "fast BLOBs" feature which does not require base64 encoding in the driver. That is promising. I see in the driver/server conversation that the CCD Simulator uses pingRequest and pingReply, but it is still not clear to me how this works. The documentation is not clear here, and it will likely be necessary to reverse engineer the INDI source code. This is definitely a long-term task and beyond my resources.
Sorry, I will not be able to implement streaming in the near future.
I understand. I will close this issue for now. Maybe you could just prepare an entry point in your driver where this should be implemented, so that someone else could do it?
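Even two empty stub methods marking the place would already help, something like this (purely hypothetical names, not the real structure of your driver):
class CameraControl:
    # Hypothetical sketch of an entry point for streaming, not the actual driver class.
    def startStreaming(self, width, height, fps):
        # TODO: configure libcamera/Picamera2 for continuous capture and push each
        # encoded frame to the client as an INDI BLOB (ideally via fast BLOBs).
        raise NotImplementedError("video streaming is not implemented yet")

    def stopStreaming(self):
        # TODO: stop the continuous capture and release the encoder/camera.
        raise NotImplementedError("video streaming is not implemented yet")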
Any hope for a live video support implementation? I'm wondering if this would be useful for capturing Moon and planetary videos?