QuantumEntangledAndy / neolink

An RTSP bridge to Reolink IP cameras
GNU Affero General Public License v3.0

Request still JPEG image #28

Closed dkerr64 closed 1 year ago

dkerr64 commented 1 year ago

I would like to be able to capture a still JPEG image from Reolink battery cameras (specifically Argus 3 Pro).

With wired Reolink cameras I can connect to the IP address and capture an image. For example...

curl -s -m 15 http://192.168.1.111/cgi-bin/api.cgi\?cmd=Snap\&channel=0\&rs=randomstr\&user=admin\&password=password -o image.jpeg

I would like to be able to do the same with the Argus 3 Pro... without keeping the camera on-and-live-streaming all the time (to minimize battery drain).

Is this possible?

QuantumEntangledAndy commented 1 year ago

Currently, no. It's on the feature list, and if you can code Rust it's not too hard to do. The camera spits out H264/H265 frames, so you log in, start the stream, wait for an IFrame, convert it to a JPEG, and log out again. The difficulty lies in the conversion from IFrame to JPEG, which we would have to delegate to gstreamer.

dkerr64 commented 1 year ago

Thanks for the update. I have not programmed in rust before so it would take some time for me to figure things out.

dkerr64 commented 1 year ago

Okay, so I am reading up on Rust. I have it installed, I cloned your package, and I have it building. The next step is to figure out all the logic in this application. I also need to decide whether requesting a JPEG should be its own completely separate code path (i.e., start neolink with a 'jpeg' parameter) or should piggyback on the existing RTSP code, and whether it should listen on its own port or accept requests on the same port the RTSP requests go to.

I don't know how far I will go with this but am looking into what might be required.

Thx.

QuantumEntangledAndy commented 1 year ago

I'm working on a major rewrite of the code to reduce latency, improve battery camera support, and support cellular cameras. This will likely affect how streams are read since I am swapping to an async model.

The enthusiasm is nice though, so if you want to try it, please do. It just might need some more changes after I've finished my adjustments to make it work with the new system. I'll help with that bit too if you've got a PR ready by then.

In terms of adding this functionality, I recommend you copy crates/core/bc_protocol/stream.rs to crates/core/bc_protocol/still.rs and adjust it to return a single IFrame instead of any BcMedia, as a simple starting point.

dkerr64 commented 1 year ago

Don't worry about me. It is going to take me some time to get up to speed on both rust and the BC camera protocol. So I don't expect to be contributing anything for a while. I'm really grateful that I found this project and that you are actively working on improving it.

FYI, here is what I am trying to do. I have several PoE Ethernet-connected Reolink cameras and created a web page that shows the current image from all of them. I experimented with both RTSP and JPEG; both were slow and unreliable, and I wanted my page to flash up immediately. The solution I have is a script running on a server that pulls a JPEG from each camera once a minute and saves it to a ramdisk. When the web page is opened it pulls those saved images, so the page comes up instantly with images that are no more than one minute old. My script then switches to pulling a JPEG every couple of seconds until a timeout period (say 2 minutes), so for those couple of minutes I have pseudo-realtime updates of the JPEG. It has proved much better than trying to pull an RTSP video or JPEG directly from the cameras.

Now I have added a battery powered WiFi camera so I want to pull a JPEG from it at regular intervals. Maybe not once a minute... I'll fine tune that based on battery life as it will have to run up to 16 hours between charges.

Thanks.

dkerr64 commented 1 year ago

I think the answer may be in src/rtsp/gst.rs, where the stream_recv function detects an Iframe and pipes it into gstreamer. That function is called from src/rtsp/states/streaming.rs.

So, thinking out loud, a possible solution would be to test for a jpeg flag in gst.rs and then change the processing to convert the Iframe into a jpeg, save it, and then abort the video streaming from the camera.

There could be additional code (I don't know where) that would restart the streaming again some number of seconds later to capture another image. Maybe integrated with your "paused" function... un-pause after n seconds.

It looks like gstreamer can convert from video to jpeg, but ffmpeg is possibly another option.

Thoughts?

dkerr64 commented 1 year ago

I confirmed that gstreamer will output a jpeg. A simple test from the command line, based on their documentation, works...

gst-launch-1.0 videotestsrc num-buffers=1 ! video/x-raw ! jpegenc ! filesink location=test.jpeg

Now I am rapidly getting out of my depth: I don't really understand the overall structure of neolink yet, and Rust is unfamiliar to me, so I have not yet figured out how gstreamer is initialized and called from neolink. But it looks like all the code that creates an RTSP server could be skipped and replaced by something that just outputs a JPEG file on request.
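
For what it's worth, here is a minimal standalone Rust sketch of what I mean: initialize gstreamer and run the same decode-to-jpeg pipeline programmatically. The file paths are placeholders, and this is not how neolink actually wires up gstreamer internally.

```rust
use gstreamer as gst;
use gst::prelude::*;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;
    // Same idea as the gst-launch test: decode a dumped H264/H265 frame and
    // write a single JPEG. Paths here are placeholders.
    let pipeline = gst::parse_launch(
        "filesrc location=/tmp/test.bin ! decodebin ! videoconvert ! \
         jpegenc snapshot=TRUE ! filesink location=/tmp/test.jpeg",
    )?;
    pipeline.set_state(gst::State::Playing)?;

    // Block until the pipeline finishes (EOS) or errors out.
    let bus = pipeline.bus().expect("pipeline has a bus");
    for msg in bus.iter_timed(gst::ClockTime::NONE) {
        use gst::MessageView;
        match msg.view() {
            MessageView::Eos(..) => break,
            MessageView::Error(err) => {
                eprintln!("gstreamer error: {}", err.error());
                break;
            }
            _ => (),
        }
    }
    pipeline.set_state(gst::State::Null)?;
    Ok(())
}
```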

QuantumEntangledAndy commented 1 year ago

Certainly the plan seems sound. Have a look at the code in talk. Talk converts source audio (AAC/MP3 to ADPCM, which the camera needs) using gstreamer; that will probably be a good base to start from.

dkerr64 commented 1 year ago

Progress report. For testing purposes I have inserted code into streaming.rs which tests for a jpeg flag and, if true, saves the data to a file instead of sending it to locked_output.stream_recv. A snippet of the test code looks like this...

```rust
for datum in data.drain(..) {
    if jpeg {
        match datum? {
            BcMedia::Iframe(payload) => {
                info!("Got an Iframe");
                file.write_all(&payload.data).expect("Unable to write file");
                got_iframe = true;
            }
            BcMedia::Pframe(payload) => {
                info!("Got a Pframe");
                if !got_pframe {
                    file.write_all(&payload.data).expect("Unable to write file");
                }
                if got_iframe && !got_pframe {
                    info!("Here 1");
                    got_pframe = true;
                    arc_abort_handle.abort();
                }
            }
            _ => {
                // Ignore other BcMedia
            }
        }
    } else {
        locked_output.stream_recv(datum?)?;
    }
}
```

I then run the binary file through gstreamer at the command line with...

gst-launch-1.0 filesrc location=/tmp/test.bin ! decodebin ! jpegenc snapshot=TRUE ! filesink location=/home/David/test.jpeg

This is working.

That's it for now.

dkerr64 commented 1 year ago

I found a tip on the internet and I can simplify the above to just this...

```rust
match datum? {
    BcMedia::Iframe(payload) => {
        info!("Got an Iframe");
        let mut file = std::fs::File::create("/tmp/test.bin")?;
        file.write_all(&payload.data).expect("Unable to write file");
        // Write it twice so we don't get an error that there is no end of frame
        file.write_all(&payload.data).expect("Unable to write file");
        //arc_abort_handle.abort();
        //break;
    }
    _ => {
        // Ignore other BcMedia
    }
}
```

There is no need to write out the following Pframe; you can write the Iframe twice, and that tricks the decoder into thinking that the first frame has properly ended. There is probably a more elegant way... writing out some dummy empty frame, perhaps... but this works.

Iframes are coming in roughly every 3 seconds, so I can let neolink run and the saved binary is updated every 3 seconds (note, I am doing this on only one thread... I am only starting "clear" and not "Fluent" or "Balanced"). Or I can abort after receiving the first Iframe.

Now that I know this works, I'm content to save these frames and do the conversion to jpeg outside of neolink (for now), because I need to turn my attention to how to manage the whole process... which impacts other parts of neolink.

QuantumEntangledAndy commented 1 year ago

So I think you need two Iframes, because the auto-detect feature in gstreamer requires at least two frames to work out the format. If you do this in neolink itself we can tell gstreamer the format of the data and skip the need for the double frame.
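
Something along these lines, perhaps (just a sketch against the gstreamer Rust bindings; the exact caps fields for what the camera actually sends would need checking):

```rust
use gstreamer as gst;
use gstreamer_app as gst_app;

// Sketch: declare the stream format up front on the appsrc so the decoder
// does not need a second frame to auto-detect it. The field values here are
// a guess for an H264 byte-stream and would need verifying.
fn declare_h264_caps(appsrc: &gst_app::AppSrc) {
    let caps = gst::Caps::builder("video/x-h264")
        .field("stream-format", "byte-stream")
        .field("alignment", "au")
        .build();
    appsrc.set_caps(Some(&caps));
}
```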

I think I could add a command to dump a frame to jpeg, but I kinda feel like you are having fun with the hacking and don't mind if you want to keep at it.

QuantumEntangledAndy commented 1 year ago

p.s. To do syntax highlighting on GitHub, use this syntax: open the code block with a triple backtick followed by the language name, and close it with a triple backtick:

    ```rust
    // rust code here
    ```

dkerr64 commented 1 year ago

> p.s. To do syntax highlighting on GitHub, use this syntax: open the code block with a triple backtick followed by the language name, and close it with a triple backtick.

Cool, thanks for the tip.

dkerr64 commented 1 year ago

> So I think you need two Iframes, because the auto-detect feature in gstreamer requires at least two frames to work out the format. If you do this in neolink itself we can tell gstreamer the format of the data and skip the need for the double frame.
>
> I think I could add a command to dump a frame to jpeg, but I kinda feel like you are having fun with the hacking and don't mind if you want to keep at it.

I am significantly slowed down by this being my first run-in with Rust, so I have put aside the actual gstreamer convert-to-jpeg part while I muddle through getting familiar with the language. Its strict typing, unfamiliar error handling, move/borrow semantics, and everything else are taking some getting used to. But yes, this is useful learning for me, so I will keep at it as time permits. Thank you.

Checksum commented 1 year ago

@dkerr64 This may not suit your needs, but I'm planning to get periodic still images through Scrypted. I was previously able to add Neolink as an RTSP device and view the stream in Scrypted, but ran into the 60s freezing bug. I've yet to try the latest RC, but will update once I get the chance.

dkerr64 commented 1 year ago

@Checksum thanks for the pointer. The Scrypted plugin for Reolink uses the published Reolink API, which works with their wired cameras. I have a battery-powered WiFi camera, and Reolink does not permit their API on these cameras, on the grounds that, being battery powered, they need to be carefully power managed by Reolink; e.g. they do not record continuously to SD card nor stream continuously.

Neolink has reverse-engineered access to these cameras, allowing for an RTSP stream. But a continuous RTSP stream is not a good idea because of the battery drain issue. See my comment in #30 for my suggestion on how to manage it.

Checksum commented 1 year ago

@dkerr64 I have a wireless Reolink as well and didn't use the Scrypted Reolink plugin. I went with the RTSP device instead and added the Neolink RTSP endpoint. Scrypted also lets you script actions, so I'm hoping to use that for periodic tasks (I haven't validated this though).

The big question is whether the camera battery is preserved!

QuantumEntangledAndy commented 1 year ago

I have a basic command in #32 that will dump an image on demand. I think it does not fulfil all your needs, since you also want to do wake-ups from pause etc., but it may be useful for you to see how to convert to jpeg in-app, which you can work up into something more complete.

dkerr64 commented 1 year ago

Thanks for the basic framework. It is not working for me, but I can start debugging it. A zero-length file is created and streaming from the camera starts, but after a while it just hangs and all log messages stop. The tail end of the log is...

```
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.RecievedPacker: NewData: 248
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.RecievedPacker: NewData: 249
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.RecievedPacker: NewData: 250
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.RecievedPacker: NewData: 251
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.RecievedPacker: NewData: 252
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.RecievedPacker: NewData: 253
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::bcconn] occ.full: 100/100
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.State: Normal
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.RecievedPacket: Resend
[2023-02-24T02:17:03Z TRACE neolink_core::bc_protocol::connection::udpsource] UDPSource.RecievedPacket: QueueingAck
[2023-02-24T02:17:03Z TRACE mio::poll] deregistering event source from poller
^C
```
QuantumEntangledAndy commented 1 year ago

Ah, you have occ.full (the buffer is full). I thought that I had fixed this bug; I will check again.

dkerr64 commented 1 year ago

See the update I made here... https://github.com/QuantumEntangledAndy/neolink/commit/a7bd8a9abcbf9fe2f680c14caab74ff95cd60323

This captures two frames on the one connection, whereas you were redoing the connection part to get the 2nd frame. This seems to work a lot better for me.

But from time to time it just hangs at the end and needs Ctrl-C, and the file is zero length. I cannot find any pattern.

QuantumEntangledAndy commented 1 year ago

I could make it so that we keep sending frames until gstreamer has enough data to make the jpeg, then quit. I just need to set up the right channels for the data.

dkerr64 commented 1 year ago

I was thinking something similar overnight... if it needs more than 2 frames to create the jpeg, then give it more frames!

Is it possible to know from gstreamer when to stop?

In the meantime I modified the code to accept --frames=3 on the command line, so you can set the number of frames you want to capture. It defaults to 3. However, if I ask for 10 or more then I get an error

Error: Failed to send buffer: SendError(Data([0, 0, 0, 1,....

so I think we get a buffer overflow or something? However, it does seem to work more reliably with multiple frames. You can see this here... https://github.com/QuantumEntangledAndy/neolink/commit/95e5f952ff49614a0358a4d2a6459231182de801

But as it's still not 100% reliable, I also decided to implement a --raw option which just saves the raw video stream to a file instead of sending it to gstreamer. This took a lot of doing, thanks to my inexperience with Rust, but I eventually got it to compile and learned a lot more about Rust in the process. So if converting to jpeg within neolink is unstable, I can handle it outside. You can see that here (commit/diff on top of the above): https://github.com/QuantumEntangledAndy/neolink/commit/7b30ed07d6803f034a30d8afed497f8eb4a9db45

I also notice that we send an end-of-stream to gstreamer. I wonder if that is necessary, given that snapshot=TRUE in the pipeline will do that?

Next steps for me are to figure out how to take a periodic image (every n seconds).

Finally... the commit you pushed last night does not compile. I think you forgot to include keepalive.rs, so I could not take it into my fork.

dkerr64 commented 1 year ago

Also, tell me if you want a PR to pull my changes into your branch. And note that I have some log::info! statements which should probably change to debug, but I kept them as info for now, as it makes life so much easier not to mingle them with hundreds of other debug statements.

QuantumEntangledAndy commented 1 year ago

I'm going to make it keep pulling frames until gstreamer stops, so there's no need for a PR as it will be quite different.

dkerr64 commented 1 year ago

Ok. I do want the --raw capability though, so try to keep that please.

Also, observing the network traffic, the camera keeps sending frames for 10 seconds after neolink stops. Is there a way I can tell the camera to stop sending immediately?

I'm hoping to get periodic image capture within neolink. As currently implemented, I need to wrap neolink in another program which calls neolink every 'n' minutes. But it would be best if I could tell neolink to save an image periodically and go to sleep in between. Whatever the end design, it needs to be as power efficient as possible... use as little camera power, and as little server CPU, as possible.
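
Conceptually I imagine something like this inside neolink (just a sketch; capture_one_image is an invented placeholder, not a real neolink function):

```rust
use std::time::Duration;
use tokio::time::interval;

// Sketch of the "periodic capture" idea: wake, grab one image, then sleep
// until the next tick so the camera can power down in between.
async fn capture_every(secs: u64) {
    let mut ticker = interval(Duration::from_secs(secs));
    loop {
        ticker.tick().await;
        // hypothetical: connect, pull one IFrame, write the jpeg, disconnect
        // capture_one_image().await;
    }
}
```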

Thanks David

QuantumEntangledAndy commented 1 year ago

I think maybe raw should be another command-line option, since you are effectively saving the h264/5 stream to disk.

QuantumEntangledAndy commented 1 year ago

We do send the camera a stop message, but this is UDP, which is inherently lossy, so it may not have heard us. I could perhaps also send the C2D_Disc, but it's not a high priority for me right now.

QuantumEntangledAndy commented 1 year ago

Anyway, neolink now pulls frames until gstreamer is happy. See the latest commit.
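
Very roughly, the idea is something like this (an illustrative sketch only, not the actual code in the commit; the real implementation is structured differently):

```rust
use gstreamer as gst;
use gstreamer_app as gst_app;

// Keep pushing camera frames into the appsrc until the pipeline posts EOS,
// i.e. until jpegenc (snapshot=TRUE) has produced its single image.
fn feed_until_done(
    appsrc: &gst_app::AppSrc,
    bus: &gst::Bus,
    frames: impl Iterator<Item = Vec<u8>>,
) {
    for data in frames {
        let buffer = gst::Buffer::from_slice(data);
        if appsrc.push_buffer(buffer).is_err() {
            break; // pipeline has already shut down
        }
        // Zero timeout: just poll for a pending EOS message.
        if bus
            .timed_pop_filtered(gst::ClockTime::ZERO, &[gst::MessageType::Eos])
            .is_some()
        {
            break; // gstreamer has what it needs
        }
    }
}
```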

dkerr64 commented 1 year ago

> I think maybe raw should be another command-line option, since you are effectively saving the h264/5 stream to disk.

My motivation was only that converting to jpeg inside neolink seemed unreliable. However, your latest version looks to be pretty solid. It has not failed on me yet with the two cameras I am hitting with it. Even more impressive is that I can have Reolink's iPhone client streaming and it doesn't miss a beat when I pull down a JPEG.

I am seeing anything from 2 to 5 frames sent into gstreamer.

Will keep testing.

Thanks

QuantumEntangledAndy commented 1 year ago

Oh good, glad it's working. I'd like to merge into master then, since it also has some misc bug fixes that I fixed along the way.

Also, about running every x seconds: if you're on a Linux/macOS box I recommend cron. Windows is more difficult though, and I'm not too familiar with a good method for that OS.

dkerr64 commented 1 year ago

The only thing I would do before merging is fix all the spelling and typos. I have done this in my fork but it is no longer a clean merge. Let me sync my fork with yours, then I'll apply the spelling/typo fixes.

dkerr64 commented 1 year ago

This has fixes for spelling and typos... https://github.com/dkerr64/neolink/commit/fe0e81b6720aa15e4414c6fe1ca1b7307f3846f1

Everything else is identical between our branches. You can probably just cherrypick that one commit.

David

dkerr64 commented 1 year ago

I know there is a separate issue open for this, but I am still seeing the deserialization error...

```
david@neolink:~/github/neolink$ ~/github/neolink/target/debug/neolink image --config=/home/david/neolink.toml --file-path=/home/david/image.jpeg driveway
[2023-02-26T14:37:57Z INFO  neolink] Neolink 0.5.2 (unknown commit) debug
[2023-02-26T14:37:57Z INFO  neolink::utils] driveway: Connecting to camera at UID: <redacted>
[2023-02-26T14:38:01Z INFO  neolink_core::bc_protocol] Local discovery success <redacted> at 192.168.21.38:48568
[2023-02-26T14:38:01Z INFO  neolink::utils] driveway: Logging in
[2023-02-26T14:38:02Z INFO  neolink::utils] driveway: Connected and logged in
[2023-02-26T14:38:13Z ERROR neolink_core::bc_protocol::connection::bcconn] Deserialization error: Io(Custom { kind: Other, error: CameraTerminate })
```

The reason I mention it here is that, in the case of image capture, I don't think neolink should attempt to recover; it should exit with an error rc. Instead it hangs and I have to Ctrl-C.

Above was all local discovery.

David

dkerr64 commented 1 year ago

I think we also need to add a timeout, because sometimes it just sits and hangs without ever receiving a stream. This is not unique to neolink... I see it on the official iPhone app as well. But for this use case I think we need a --timeout=60 command-line option so we don't wait indefinitely.

This is needed so that I can call neolink periodically to capture an image and know that in error situations it will return to me with an error rc.

Thanks

dkerr64 commented 1 year ago

It is hanging inside stream_data.get_data().await??, so I think the timeout needs to be part of the tokio network layer?
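
Something like this could wrap the hanging call (a minimal sketch assuming tokio and anyhow; the real return types of get_data() in neolink may differ):

```rust
use std::time::Duration;
use tokio::time::timeout;

// Minimal sketch: wrap any future that might hang (e.g. the get_data() call)
// in a deadline. Assumes the wrapped future already returns anyhow::Result.
async fn with_deadline<T, F>(fut: F, secs: u64) -> anyhow::Result<T>
where
    F: std::future::Future<Output = anyhow::Result<T>>,
{
    match timeout(Duration::from_secs(secs), fut).await {
        Ok(res) => res, // completed in time; pass through its own Ok/Err
        Err(_) => anyhow::bail!("no data from the camera within {secs}s, giving up"),
    }
}
```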

Again, this is caused by something wrong at the camera end... my iPhone app also won't start streaming (it may be temperature related as we were way below zero overnight). But neolink should be a bit more resilient.

dkerr64 commented 1 year ago

As a workaround to hangs/errors I can wrap neolink with the Linux timeout command. It returns rc=124 if the command (in this case neolink) is terminated for timing out. So in a script I can test for 0 or 1 (success/fail from neolink) and 124 (timeout) and continue on my merry way.

Bottom line... I think you should merge into master. The problems I raise here are not unique to 'image'.

QuantumEntangledAndy commented 1 year ago

Gonna close this issue then since it's merged, in the hope that we don't stray off topic too much. Just open up any new issues you feel are relevant.