Closed by bakercp 11 years ago
I have just made the GStreamer video grabber work on the Panda. I needed to comment out the framerate settings in the pipeline, but that was it. I'll #ifdef it or something so it keeps working on the rest of the platforms, and upload my changes soon.
awesome - would love to try it :)
With the new Poco libs it is working for me with the PS3Eye.
I believe the PS3Eye supports x-raw-rgb in hardware (OF by default sets this to x-raw-yuv)
@arturoc One thing I wasn't sure how to do is use setPipeline with my own string. What I ended up doing is hijacking it by putting in these lines:
string forcedString = "v4l2src name=video_source device=/dev/video0 ! video/x-raw-rgb,width=320,height=240,framerate=60/1 ! ffmpegcolorspace";
pipeline = forcedString;
It worked pretty well! - much faster: https://vimeo.com/54916224
What is the proper way of using setPipeline without initgrabber overwriting it?
edit: for reference this is where I see the overwrite happening: https://github.com/openFrameworks-RaspberryPi/openFrameworks/blob/develop-raspberrypi/libs/openFrameworks/video/ofGstVideoGrabber.cpp#L662
great!
The GStreamer video grabber should give priority to RGB so it doesn't need to do a colorspace conversion. I'll take a look.
To set up a custom pipeline you need to use ofGstVideoUtils and call setPipeline. It's used like a video player, so you need to call play() even for a camera so it starts the pipeline. It also doesn't have a texture, so you need to use a custom one.
Hmm, I'm checking with the PS3Eye and it actually is setting RGB, but it drops some frames. The pipeline seems to be the same one you are setting:
[notice] ofGstUtils: selected device: USB Camera-B4.04.27.1
[notice] ofGstUtils: selected format: 320x240 video/x-raw-rgb framerate: 125/1
[notice] gstreamer pipeline: v4l2src name=video_source device=/dev/video0 ! video/x-raw-rgb,width=320,height=240 ! ffmpegcolorspace ! appsink name=ofappsink caps="video/x-raw-rgb, depth=24, bpp=24, endianness=4321, red_mask=0xff0000, green_mask=0x00ff00, blue_mask=0x0000ff, alpha_mask=0x000000ff, width=320, height=240"
Weird - here is my test code: https://gist.github.com/4213640. I did put this in: vidGrabber.setDesiredFrameRate(60);
Yes, I'm looking at it, and I think the problem is the framerate. I had to comment out the framerate settings to make it work on the PandaBoard, so now it's selecting the highest rate, 125, which could be the problem.
(updated title)
I previously had the wrong line number above - my line
string forcedString = "v4l2src name=video_source device=/dev/video0 ! video/x-raw-rgb,width=320,height=240,framerate=60/1 ! ffmpegcolorspace";
pipeline = forcedString;
right above:
gchar* pipeline_string =
The differences are pretty dramatic - otherwise I am getting:
[verbose] GStreamer: unhandled message from video_source
[verbose] GStreamer: Got warning message from video_source
@arturoc How about an option in ofGstVideoGrabber that allows you to override the pipeline and still use ofVideoGrabber? It would also be useful for unusual hardware like the Blackmagic capture boxes, etc., where you may want to try custom pipelines.
I created a branch to demo the approach here: https://github.com/jvcleave/openFrameworks/tree/gstPipelineOverride
It doesn't seem to break the old way, so it is a good comparison to use with the RPi/PS3Eye.
test implementation (edit: updated without wait) https://github.com/jvcleave/openFrameworks/blob/gstPipelineOverride/apps/devApps/rpi_initOverride/src/testApp.cpp
and the ofGstVideoGrabber changes: https://github.com/jvcleave/openFrameworks/commit/fc8ddf7316a773e847d9908082aa48b25b6795cb
@jvcleave Curious why you have to delay for so long to init the cam? Is that an odd Raspberry Pi thing? Is the init non-blocking or something?
@bakercp When we first started, I was having weird issues if certain code ran before the screen/window initialized; it was carried over from my old example. I just removed it and it works fine.
Nice!
For simplicity, perhaps you could get rid of the doOverridePipeline bool and just init customPipeline to an empty string. Then have a setter like void setCustomPipeline(const string& pipeline);. Then if customPipeline gets set to something with length > 0, you could probably even skip all of that other pipeline_string generation code above and just set pipeline_string = customPipeline;
edited above ...
yeah - I agree. It def needs another pass but just wanted to demonstrate the concept.
I thought I could do something like ofGstVideoGrabber::initGrabberWithPipeline(w, h, pipeline)
but ofVideoGrabber is the one that calls ofGstVideoGrabber::initGrabber
Yes, the problem with this is that ofVideoGrabber will do more or less the same as ofGstVideoUtils. It doesn't have a texture, and the only code it adds is the v4l2 device detection; everything else is just a wrapper around ofGstVideoUtils. So if you need a custom pipeline, I think it's better to just use ofGstVideoUtils for video, or ofGstUtils if it's audio, network, or anything else that doesn't need pixels.
I see what you are saying - I guess the ofGstVideoUtils being a grabber is a bit weird but Gst is a bit weird :)
Just tried this with the PS3Eye and it seems to work just as well
ofGstVideoUtils videoUtils;
ofTexture videoTexture;
int camWidth;
int camHeight;
//--------------------------------------------------------------
void testApp::setup(){
ofSetLogLevel(OF_LOG_VERBOSE);
camWidth = 320;
camHeight = 240;
string pipeline = "v4l2src name=video_source device=/dev/video0 ! video/x-raw-rgb,width=320,height=240,framerate=60/1 ! ffmpegcolorspace";
videoTexture.allocate(camWidth, camHeight, GL_RGB);
bool didStart = videoUtils.setPipeline(pipeline, 24, false, camWidth, camHeight);
videoUtils.play();
}
//--------------------------------------------------------------
void testApp::update(){
ofBackground(100,100,100);
videoUtils.update();
if (videoUtils.isFrameNew())
{
videoTexture.loadData(videoUtils.getPixels(), camWidth, camHeight, GL_RGB);
}
}
//--------------------------------------------------------------
void testApp::draw(){
videoTexture.draw(20, 20, camWidth, camHeight);
}
BTW, why do you want to set a custom pipeline in this case? That's the same pipeline the video grabber should be setting; you just need to call setDesiredFramerate before init.
I think you were getting different results, but I get the YUV pipeline by default and it drops frames:
[notice] gstreamer pipeline: v4l2src name=video_source device=/dev/video0 ! video/x-raw-yuv,width=320,height=240 ! ffmpegcolorspace ! appsink name=ofappsink caps="video/x-raw-rgb, depth=24, bpp=24, endianness=4321, red_mask=0xff0000, green_mask=0x00ff00, blue_mask=0x0000ff, alpha_mask=0x000000ff, width=320, height=240"
I also get these messages:
[verbose] GStreamer: Got warning message from video_source
[verbose] GStreamer: unhandled message from video_source
What's the latest on this issue? ofVideoGrabber works out of the box for me with a PS3Eye and the current develop-raspberrypi branch. It's a little slow, but very usable.
It works - I say we close it and open new issues for specific problems.
Perfect.
Some inspiration from @jvcleave
https://vimeo.com/53996419