[Closed] mihirbala closed this issue 2 years ago
I'm guessing it's probably just a network issue. You probably want to use TCP instead of UDP, but in any case, check the log and see what it says.
I tried to use TCP, but it doesn't seem to work. This particular RTSP stream appears to be viewable only over UDP (I have tried VLC, ffplay, mpv, and other programs with TCP, and all refuse to open the stream). As for the log, I read through previous GitHub issues but I'm not exactly sure how to enable it. Do we call FFmpegLogCallback.set() in a callback function we define? I'm a bit confused about how to view those messages. Thanks again for your help.
If you're not seeing any messages on the console, make sure that FFmpegLogCallback.set() has been called, yes.
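For reference, a minimal sketch of enabling the log callback before creating the grabber. The choice of `AV_LOG_DEBUG` here is just an assumption to maximize verbosity; any lower level also works:

```kotlin
import org.bytedeco.ffmpeg.global.avutil
import org.bytedeco.javacv.FFmpegLogCallback

// Routes FFmpeg's native log output to the console (Logcat on Android).
// Call this once, before constructing the FFmpegFrameGrabber.
fun enableFfmpegLogging() {
    FFmpegLogCallback.set()
    // Raise verbosity so connection-level errors show up in the log.
    avutil.av_log_set_level(avutil.AV_LOG_DEBUG)
}
```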
I set the log callback and dumped the whole log. Before I get the error `avformat_open_input() error -110: Could not open input "rtsp://192.168.42.1/live". (Has setFormat() been called?)`, I get several other system errors of the form `Error: [tcp @ tcp://192.168.42.1:554?timeout=0 failed: Connection timed out]`. I'm not seeing any other log data that could be from FFmpeg. If these originate from the FFmpegFrameGrabber, that would imply it is trying to connect over TCP under the covers, which, as I mentioned earlier, doesn't seem to work with this RTSP link. Do you think this could be the issue?
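For anyone hitting the same TCP timeouts in the log: the RTSP transport can be pinned to UDP via `setOption`. A minimal sketch, with the caveat that the `"timeout"` option name and units depend on the bundled FFmpeg build (older builds use `"stimeout"` for RTSP), so treat that part as an assumption:

```kotlin
import org.bytedeco.javacv.FFmpegFrameGrabber

// Builds a grabber that explicitly requests UDP as the RTSP transport,
// which should prevent FFmpeg from attempting TCP connections.
fun makeUdpGrabber(url: String): FFmpegFrameGrabber =
    FFmpegFrameGrabber(url).apply {
        format = "rtsp"
        setOption("rtsp_transport", "udp")
        // Assumed option name; some FFmpeg builds expect "stimeout" instead.
        setOption("timeout", "5000000") // microseconds
    }
```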
[EDIT]: I think there might be an issue with the underlying networking between the RTSP source and the watch. I will update this post once I've figured out the problem. TL;DR: this may not be an issue with FFmpegFrameGrabber at all.
Turns out the Wi-Fi connection between the watch and the RTSP source was buggy. I tried with a fresh watch and the FFmpegFrameGrabber works like a charm. Thanks for your help!
Hello,
I am trying to read an RTSP UDP stream frame by frame on a Wear OS device, namely the Samsung Galaxy Watch 4. I started by implementing the FFmpegFrameGrabber in a normal Android project and running it on a Google Pixel 4a. It worked perfectly. Here is my code:
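(The original code block did not survive here. As a purely hypothetical reconstruction, a minimal FFmpegFrameGrabber loop on Android might look like the following; names such as `grabFrames` are placeholders, not the author's actual code.)

```kotlin
import org.bytedeco.javacv.AndroidFrameConverter
import org.bytedeco.javacv.FFmpegFrameGrabber

// Opens the RTSP stream and pulls frames until the stream ends.
fun grabFrames(url: String) {
    val grabber = FFmpegFrameGrabber(url)
    grabber.format = "rtsp"
    val converter = AndroidFrameConverter()
    grabber.start()
    try {
        while (true) {
            // grabImage() returns null when no more frames are available.
            val frame = grabber.grabImage() ?: break
            val bitmap = converter.convert(frame) // hand off to the UI
        }
    } finally {
        grabber.stop()
        grabber.release()
    }
}
```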
After I got this working, I ported the app to Wear OS. Everything built just fine. However, when I run the app, I am unable to open the RTSP stream. In particular, this piece of code
```kotlin
try {
    grabber!!.start()
} catch (e: FrameGrabber.Exception) {
    e.message?.let { Log.d(TAG, it) }
    grabber!!.release()
}
```
throws the exception `Could not open input "rtsp://192.168.42.1/live". (Has setFormat() been called?)`. I tried adding a `setFormat("rtsp")` call during the grabber initialization, to no avail. I did some research and found some other third-party apps that claim to be able to read a UDP RTSP stream on Wear OS, so I'm fairly sure it is supported by the hardware. I just wanted to make sure that JavaCV would work in this particular use case and that it's not just me making a stupid mistake. Thanks so much in advance for your help.