processing / processing-video

GStreamer-based video library for Processing

[opengl] Cannot use capture with gstreamer pipeline #145

Closed msoula closed 4 years ago

msoula commented 4 years ago

If used with OpenGL, Capture sets up a pipeline connecting GStreamer to a buffer sink, which is usually associated with the parent applet. In this situation, it is not possible to get captured images through the pixels[] array or the Capture#get() method.

From what I understand of the source code, once Capture#read() is called, the GStreamer thread takes care of calling the buffer-sink methods at the right time. Logically, it is not necessary to call read() more than once. Moreover, it seems possible to replace the default buffer sink (built from the parent PApplet) with another one by calling the PGraphics#set() method.
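To make that push model concrete, here is a plain-Java sketch of the mechanism as I understand it (the class and interface names below are illustrative stand-ins, not the library's actual API): once the streaming thread is running, it keeps delivering samples to whatever sink is registered, with no further read() calls needed.

```java
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative stand-in for the capture pipeline: a producer thread pushes
// each new sample into the registered sink on its own schedule.
public class SampleDispatch {

    interface SampleSink {
        void newSample(int[] pixels);
    }

    // Deliver `count` samples from a producer thread to a registered sink
    // and return how many the sink actually received.
    public static int deliver(int count) throws InterruptedException {
        AtomicInteger received = new AtomicInteger();

        // Registered sink, analogous to the PGraphics buffer sink.
        SampleSink sink = pixels -> received.incrementAndGet();

        // Producer thread, analogous to the GStreamer streaming thread:
        // once started it keeps pushing samples; nobody calls read() again.
        Thread producer = new Thread(() -> {
            for (int i = 0; i < count; i++) {
                sink.newSample(new int[640 * 480]);
            }
        });
        producer.start();
        producer.join();
        return received.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("samples delivered: " + deliver(3));
    }
}
```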

I'm trying to create a simple library using processing-video to take snapshots when the user clicks the mouse. I'd like to buffer the image so that it is only displayed when the user clicks. Here is a "working" example of what I'm trying to do:

public class VideoTest extends PApplet {

    Capture video;
    PGraphics buffer;

    PImage snapshot;

    public static void main(String[] args) {
        PApplet.main("VideoTest");
    }

    @Override
    public void settings() {
        size(640, 480, P2D);
    }

    @Override
    public void setup() {

        final String[] cameras = Capture.list();
        video = new Capture(this, cameras[0]);
        video.start();

        // set buffer for capture
        buffer = createGraphics(video.width, video.height, P2D);
        set(0, 0, video);

        // start read; buffer should get notified of new samples from Capture's NewSampleListener...
        video.read();
    }

    @Override
    public void draw() {
        snapshot.loadPixels();
        image(snapshot, 0, 0);
    }

    @Override
    public void mousePressed() {
        snapshot = buffer.copy();
    }

}

Obviously, nothing works. I get multiple instances of the following exception, which makes me think that the GStreamer thread works as expected but the buffer sink does not.

java.lang.NullPointerException
    at processing.opengl.Texture.copyBufferFromSource(Texture.java:827)
    at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at processing.video.Capture$NewSampleListener.newSample(Capture.java:783)
    at org.freedesktop.gstreamer.elements.AppSink$2.callback(AppSink.java:232)
    at sun.reflect.GeneratedMethodAccessor5.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at com.sun.jna.CallbackReference$DefaultCallbackProxy.invokeCallback(CallbackReference.java:520)
    at com.sun.jna.CallbackReference$DefaultCallbackProxy.callback(CallbackReference.java:551)

Since I'm working on a library, I cannot use captureEvent(Capture c) to be notified when new samples are available. My options are limited here, but I think all I have to do is find a way to use my PGraphics buffer correctly...
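For context on why captureEvent is off the table: as far as I can tell, the video library looks up a handler method by name on the parent sketch via reflection, so a separate library class never receives the callback. A plain-Java sketch of that dispatch style (illustrative names and signatures, not the library's actual code):

```java
import java.lang.reflect.Method;

// Illustrative sketch of reflection-based event dispatch: the dispatcher
// looks up a method by name on the target and invokes it per event. If the
// target has no such method, the event is silently dropped.
public class EventDispatch {

    // Stand-in for a parent sketch that declares an event handler.
    public static class Sketch {
        public int frames = 0;
        public void captureEvent(Object capture) { frames++; }
    }

    // Find the handler on the target, or null if it does not declare one.
    public static Method findHandler(Object target) {
        try {
            return target.getClass().getMethod("captureEvent", Object.class);
        } catch (NoSuchMethodException e) {
            return null; // no handler: events go nowhere
        }
    }

    public static void main(String[] args) throws Exception {
        Sketch sketch = new Sketch();
        Method handler = findHandler(sketch);
        if (handler != null) {
            handler.invoke(sketch, new Object()); // deliver one "frame"
        }
        System.out.println("frames delivered: " + sketch.frames);
    }
}
```

A library class that is not the sketch simply fails the lookup, which is why buffering has to happen some other way.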

Could someone help me with this? From what I've read so far, many people have encountered this issue and nobody has really managed to fix it.

msoula commented 4 years ago

I finally got Capture to work the way I wanted. I'll leave a working example here for those who might be interested.

public class VideoTest extends PApplet {

    Capture video;
    PGraphics buffer;

    PImage snapshot;

    public static void main(String[] args) {
        PApplet.main("VideoTest");
    }

    @Override
    public void settings() {
        size(640, 480, P2D);
    }

    boolean done;

    @Override
    public void setup() {

        final String[] cameras = Capture.list();
        video = new Capture(this, cameras[1]);
        video.start();

        // set the buffer that will be used for capture as buffer sink
        buffer = createGraphics(video.width, video.height, P2D);
        buffer.set(0, 0, video);

        // start read, buffer will get new samples from GStreamer thread as soon as read() is called
        while (!video.available()) {
            try { Thread.sleep(100); } catch (final InterruptedException ignore) {}
        }
        video.read();

        // Get the texture that is being filled with new samples from gstreamer
        final Texture tex = ((Texture) buffer.getCache(video));
        // Initialize the texture buffers to prevent exceptions from GStreamer
        while (!tex.hasBuffers()) {
            try { Thread.sleep(10); } catch (final InterruptedException ignore) {}
        }
        tex.getBufferPixels(new int[video.width*video.height]);

    }

    @Override
    public void draw() {
        if (null != snapshot) {
            // show snapshot
            snapshot.loadPixels();
            set(0, 0, snapshot);
        }
    }

    @Override
    public void mousePressed() {
        // load pixels from buffer sink into video's PImage pixels array
        video.loadPixels();
        // copy those pixels into our tmp image buffer
        snapshot = video.copy();
    }

}

The important thing here is to initialize the buffer's texture that collects the samples from the GStreamer pipeline, then load the pixels at the right time and copy them into a PImage.
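The buffering pattern underneath the example above — a producer continually overwriting the latest frame while the consumer takes a defensive copy only on demand — can be sketched in plain Java (illustrative types, not Processing's API):

```java
import java.util.concurrent.atomic.AtomicReference;

// Snapshot-on-demand: the producer (capture thread) keeps replacing the
// latest frame; the consumer copies it only when asked (e.g. a mouse click).
public class SnapshotBuffer {

    private final AtomicReference<int[]> latest = new AtomicReference<>();

    // Called by the producer for every new frame; older frames are dropped.
    public void push(int[] frame) {
        latest.set(frame);
    }

    // Called by the consumer; returns a defensive copy so the snapshot
    // cannot be overwritten by later frames.
    public int[] snapshot() {
        int[] frame = latest.get();
        return frame == null ? null : frame.clone();
    }

    public static void main(String[] args) {
        SnapshotBuffer buf = new SnapshotBuffer();
        buf.push(new int[] {1, 2, 3});
        int[] snap = buf.snapshot();
        buf.push(new int[] {9, 9, 9}); // later frame does not affect snap
        System.out.println(snap[0] + "," + snap[1] + "," + snap[2]);
    }
}
```

This mirrors the roles in the sketch: push() is what the GStreamer side effectively does into the buffer sink, and snapshot() is the loadPixels()-then-copy() step in mousePressed().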