sarxos / webcam-capture

The goal of this project is to allow integrated or USB-connected webcams to be accessed directly from Java. Using the provided libraries, users are able to read camera images and detect motion. The main project consists of several sub-projects - the root one contains the required classes and a built-in webcam driver compatible with Windows, Linux and Mac OS, which can stream images as fast as your camera can serve them (up to 50 FPS). The main project can be used standalone, but users are able to replace the built-in driver with a different one - such as OpenIMAJ, GStreamer, V4L4j, JMF, LTI-CIVIL, FMJ, etc.
http://webcam-capture.sarxos.pl
MIT License

Q: Not an issue, but a question... #614

Open neilyoung opened 6 years ago

neilyoung commented 6 years ago

Hi sarxos, I have an HD H.264 stream here. The source is Wifibroadcast, btw. But as you know, I need to work with the frame buffer bytes.

There is a recipe on their website showing that it should be possible to display the H.264 stream like so on a Linux machine:

https://befinitiv.wordpress.com/wifibroadcast-analog-like-transmission-of-live-video-data/wifibroadcast-fpv-manual-setup/

You can also watch the stream on a normal GNU/Linux PC using gstreamer:

#setup of wifi card is identical to that of the PI (see above)
 sudo ./rx -b 8 -r 4 -f 1024 wlan0 | gst-launch-1.0 -v fdsrc ! h264parse ! avdec_h264 !  xvimagesink sync=false

Now I'm wondering if there is a way to feed the H.264 directly to the IP camera driver. Is that possible? Or do I have to re-stream it as MJPEG?

Regards

neilyoung commented 6 years ago

Ok, another approach:

I have an H.264 video stream coming from a Raspberry Pi. I'm able to decode and display it on a Mac like so (after having installed the gstreamer-1.0 package from https://gstreamer.freedesktop.org/data/pkg/osx/1.12.4/):

/Library/Frameworks/GStreamer.framework/Commands/gst-launch-1.0 -v tcpclientsrc host=192.168.188.57 port=5000 ! gdpdepay ! rtph264depay ! avdec_h264 ! videoconvert ! videoflip video-direction=horiz ! osxvideosink sync=false

Is there any known way how to feed this into one of your capture drivers?

EDIT: I just noticed there is a GStreamerDriver available. Would that work with GStreamer 1.0 too?

sarxos commented 6 years ago

The only way IMO is to convert the H.264 to MJPEG and stream it to a local port. Then you can connect the IP camera driver to this port and get frames. The other solution is to write a dedicated driver which can read the H.264 stream and convert it to images, but I have looked into this many times in the past and did not find anything easy to use (there are frameworks which could be used for this, e.g. humble-video, jcodec). The closest option would be gstreamer-java, but this requires GStreamer to be installed.
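
For illustration, a minimal sketch of that first approach, assuming an H.264-to-MJPEG restream is already exposed at some local HTTP URL (the URL and device name below are placeholders, not part of the project; the IpCam* classes come from the driver-ipcam module):

import java.net.MalformedURLException;
import java.net.URL;

import com.github.sarxos.webcam.Webcam;
import com.github.sarxos.webcam.ds.ipcam.IpCamDevice;
import com.github.sarxos.webcam.ds.ipcam.IpCamDeviceRegistry;
import com.github.sarxos.webcam.ds.ipcam.IpCamDriver;
import com.github.sarxos.webcam.ds.ipcam.IpCamMode;

public class LocalMjpegRestreamExample {

    static {
        // use the IP camera driver instead of the default local webcam driver
        Webcam.setDriver(new IpCamDriver());
    }

    public static void main(String[] args) throws MalformedURLException {

        // placeholder URL: wherever the H.264-to-MJPEG restream is served locally
        URL url = new URL("http://127.0.0.1:8080/stream.mjpeg");

        // PUSH mode: the server pushes a MIME multipart MJPEG stream
        IpCamDeviceRegistry.register(new IpCamDevice("restream", url, IpCamMode.PUSH));

        // grab a single frame to verify the stream is readable
        Webcam webcam = Webcam.getWebcams().get(0);
        webcam.open();
        System.out.println("got frame: " + webcam.getImage());
        webcam.close();
    }
}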

neilyoung commented 6 years ago

@sarxos thanks, what do you think about GStreamerDriver?

sarxos commented 6 years ago

This won't work because the current GStreamerDriver impl is used to access local webcams (UVC devices), not remote streams. It would have to be completely rewritten to work like this.

neilyoung commented 6 years ago

Ah, ok. Yes, indeed, that doesn't work.

neilyoung commented 6 years ago

OK, I managed to make the Raspi send MJPEG directly, and from what I can see in the trace this really is an MJPEG multipart payload. The point is, there is no HTTP encapsulation, so currently I can only use curl to get the stuff (or a TCP client I guess; telnet should work - EDIT: it does work).

Any chance to modify the IPCam driver so that it would be able to cope with URLs like "tcp://..."?

Regards

sarxos commented 6 years ago

Hi @neilyoung,

Can you please elaborate more on your use case? It may be that your problem can be solved in a different way. E.g. in the webcam-capture core there is a very simple MJPEG streamer available which can do HTTP encapsulation when reading frames from a UVC device.
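
For reference, a minimal sketch of how that streamer is typically started (the port and FPS values are placeholders, and the WebcamStreamer constructor arguments are assumed to be port, webcam, FPS and auto-start, as in the project examples):

import com.github.sarxos.webcam.Webcam;
import com.github.sarxos.webcam.WebcamStreamer;

public class MjpegStreamerSketch {

    public static void main(String[] args) throws InterruptedException {

        // read frames from the default local UVC device and serve them
        // as HTTP-encapsulated MJPEG on port 8081 at 0.5 FPS
        new WebcamStreamer(8081, Webcam.getDefault(), 0.5, true);

        // keep the JVM alive while the streamer threads are running
        do {
            Thread.sleep(5000);
        } while (true);
    }
}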

sarxos commented 6 years ago

And alternatively you can try this one:

http://www.linux-projects.org/uv4l/tutorials/streaming-server/

neilyoung commented 6 years ago

@sarxos Sure. Thanks for the follow up.

OK, my Raspi is (as reported) able to issue H.264 RTP. We discussed ways to get that into Java earlier. You recommended re-streaming the H.264 as MJPEG locally.

Meanwhile I tried to let the Raspi stream MJPEG directly, and indeed: these two lines on the Raspi and the Mac client open up a window displaying video. Very fast, and with unbelievably low latency (close to what Wifibroadcast achieves).

Raspi

raspivid -t 0 -w 1080 -h 720 -fps 48 -b 10000000 -o - | gst-launch-1.0 -v fdsrc ! decodebin name=dec ! videoconvert ! jpegenc ! multipartmux ! tcpserversink host=0.0.0.0 port=5000

Mac OS

/Library/Frameworks/GStreamer.framework/Commands/gst-launch-1.0 -v tcpclientsrc host=192.168.188.63 port=5000 ! multipartdemux ! jpegdec ! videoconvert ! osxvideosink sync=false

I have had a look at what is going on at the network level, and indeed - there is MJPEG flowing from the Raspi to Mac OS:


--ThisRandomString
Content-Type: image/jpeg
Content-Length: 15683

[... binary JPEG frame data: JFIF header, Huffman tables and entropy-coded scan data, omitted here ...]

The issue now is that the Raspi doesn't provide an HTTP server, so I can't, for instance, configure a URL like this in the IpCamDriver, because nothing is listening for HTTP:

http://ip_of_raspi:5000

But what I'm able to do is use curl or telnet:

curl ip_of_raspi:5000

telnet ip_of_raspi 5000

and both start to swallow whatever comes from the remote side.

Now I thought: if I could use the IpCamDriver with either "tcp://" or "udp://" instead of "http(s)://", then the driver could choose the right transport by examining the protocol prefix.

So it wouldn't use an HTTPClient, but a TCPClient (or whatever is suitable) instead.
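
Just to illustrate what I mean, a rough sketch of what such a TCP-based reader might do (host, port and the parsing are placeholders, not an existing API; it slices single JPEG frames out of the MIME multipart stream using each part's Content-Length header):

import java.awt.image.BufferedImage;
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.net.Socket;

import javax.imageio.ImageIO;

public class TcpMjpegReaderSketch {

    public static void main(String[] args) throws IOException {
        try (Socket socket = new Socket("192.168.188.63", 5000);
             DataInputStream in = new DataInputStream(socket.getInputStream())) {

            while (true) {
                int length = readPartHeaders(in); // consume boundary + part headers
                byte[] jpeg = new byte[length];
                in.readFully(jpeg); // read exactly one JPEG frame
                BufferedImage image = ImageIO.read(new ByteArrayInputStream(jpeg));
                if (image != null) {
                    System.out.println("frame " + image.getWidth() + "x" + image.getHeight());
                }
            }
        }
    }

    // Skip the boundary line and part headers, returning the Content-Length value.
    private static int readPartHeaders(InputStream in) throws IOException {
        int length = -1;
        while (true) {
            String line = readLine(in);
            if (line.toLowerCase().startsWith("content-length:")) {
                length = Integer.parseInt(line.substring("content-length:".length()).trim());
            } else if (line.isEmpty() && length >= 0) {
                return length; // blank line after headers: JPEG bytes follow
            }
        }
    }

    // Minimal CRLF-terminated line reader working on the raw stream.
    private static String readLine(InputStream in) throws IOException {
        StringBuilder sb = new StringBuilder();
        int c;
        while ((c = in.read()) != '\n') {
            if (c == -1) {
                throw new IOException("stream closed");
            }
            if (c != '\r') {
                sb.append((char) c);
            }
        }
        return sb.toString();
    }
}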

BTW: I couldn't make uv4l run with a USB cam and didn't try with the native Raspicam (too frustrated). But GStreamer is a bummer...

sarxos commented 6 years ago

@neilyoung, if removing multipartmux from the pipeline is possible, then I could try to use a standard Socket to feed the MJPEG directly into IpCamMjpegStream. Right now it would be pretty hard to demux the MIME multipart, but skipping it is doable imo.

sarxos commented 6 years ago

@neilyoung, right now I'm a little bit busy with something else, but later today I will try to get this to work. I'm not sure if I will do this in the IP camera driver or create a separate driver for it. We will see :)

neilyoung commented 6 years ago

@sarxos Thanks. Don't bother with this right now. I will try to find other ways. Thanks

neilyoung commented 6 years ago

@sarxos OK, disregard. I have a solution. I tried uv4l again and it immediately worked out of the box. I had tried it before, but with a USB cam, and only had trouble.

sarxos commented 6 years ago

@neilyoung, it's great to hear that :) Ok then, I will drop the TCP-based IP camera driver for now.

sarxos commented 6 years ago

Hi @neilyoung,

I could not hold back and created an MJPEG driver which can be used to read from a TCP socket.

https://github.com/sarxos/webcam-capture/tree/master/webcam-capture-drivers/driver-mjpeg

This is how it can be used:

import javax.swing.JFrame;

import com.github.sarxos.webcam.Webcam;
import com.github.sarxos.webcam.WebcamPanel;
import com.github.sarxos.webcam.ds.mjpeg.MjpegCaptureDriver;

public class WebcamPanelExample {

    static {
        Webcam.setDriver(new MjpegCaptureDriver()
            .withUri("tcp://127.0.0.1:5000")
            .withUri("tcp://192.168.1.12:5000"));
    }

    public static void main(String[] args) throws InterruptedException {

        // $ gst-launch-1.0 -v v4l2src device=/dev/video0 ! capsfilter caps="video/x-raw, width=320,
        // height=240" ! decodebin name=dec ! videoconvert ! jpegenc ! multipartmux ! tcpserversink
        // host=0.0.0.0 port=5000

        final Webcam webcam = Webcam.getDefault();

        final WebcamPanel panel = new WebcamPanel(webcam);
        panel.setFPSDisplayed(true);
        panel.setImageSizeDisplayed(true);
        panel.setMirrored(true);

        final JFrame window = new JFrame("MJPEG Streaming From TCP Socket");
        window.add(panel);
        window.setResizable(true);
        window.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        window.pack();
        window.setVisible(true);
    }
}

neilyoung commented 6 years ago

@sarxos Oh, how cool is that... Thanks a lot! I will definitely give it a try and report back. As far as I can see, you are able to cope with the MIME encapsulation? Cool...

Is there in general also such a way to decode H.264 RTP?

Could be generated like so:

gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-raw,width=640,height=480,framerate=30/1' ! x264enc tune=zerolatency byte-stream=true ! rtph264pay ! gdppay ! tcpserversink host=0.0.0.0 port=5000 sync=false

or so

raspivid -t 0 -h 480 -w 640 -fps 48 -b 10000000 -o - | gst-launch-1.0 -v fdsrc ! h264parse ! rtph264pay config-interval=1 pt=96 ! gdppay ! tcpserversink host=0.0.0.0 port=5000 &

BTW: Don't get me wrong, this is just a question :) But maybe you have seen it already: both streams, when issued by the Raspi, have a very decent framerate and only a little delay. It is a real competitor to FPV solutions like Wifibroadcast (without having as many constraints as WBC has).

sarxos commented 6 years ago

@neilyoung, yes, I initially thought it would have problems with multipartmux, but after inspecting the code of MjpegInputStream I noticed that it already requires (!) the stream to be MIME multipart :suspect: I wrote it so many years ago that I had forgotten how it works.

The problem with H.264 is that there is no decent decoder available in Java. The closest option is gstreamer-java, but as I said earlier - it requires GStreamer to be installed on the system. In contrast, the MJPEG driver I wrote does not require anything to be installed.

neilyoung commented 6 years ago

@sarxos I would favour MJPEG because it is a little faster than H.264, with less delay. The strange issue I have with this is that the raspivid command runs perfectly if I issue it from the ssh shell, but by no means could I make it produce anything other than a green CIF screen on the PC when starting the same pipeline from rc.local, crontab, or whatever service after boot. I haven't figured that out yet. At the same time, the H.264 variant ran every time I rebooted the Pi.

neilyoung commented 6 years ago

@sarxos Just a short note: I gave it a try with your little test app. It worked out of the box. I then integrated it into my app: that worked too. Great job.

How realistic is the FPS display in your test app?

Regards