patricksebastien opened this issue 9 years ago
can you launch the test-pipeline (that used to work) while forcing the caps to the very same as in the failing pipeline?
i also often find that adding queue elements sometimes simply helps.
finally, i don't see any crash here (gst-launch simply stops)
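e.g. something like this (a sketch only; the width/height/framerate values are placeholders, use the caps from your failing pipeline):
gst-launch-0.10 videotestsrc ! video/x-raw-yuv,width=640,height=480,framerate=30/1 ! queue ! v4l2sink device=/dev/video0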
Today I'm not able to make it work with videotestsrc! Really lost here...
Linux pscbox 3.8.0-32-lowlatency #24-Ubuntu SMP PREEMPT Tue Oct 15 21:20:09 UTC 2013 x86_64 x86_64 x86_64 GNU/Linux
psc@pscbox:~$ sudo modprobe v4l2loopback
psc@pscbox:~$ gst-launch-0.10 videotestsrc ! v4l2sink device=/dev/video0
Setting pipeline to PAUSED ...
ERROR: Pipeline doesn't want to pause.
ERROR: from element /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0: Failed to query attributes of input 0 in device /dev/video0
Additional debug info:
v4l2_calls.c(142): gst_v4l2_fill_lists (): /GstPipeline:pipeline0/GstV4l2Sink:v4l2sink0:
Failed to get 0 in input enumeration for /dev/video0. (25 - Inappropriate ioctl for device)
Setting pipeline to NULL ...
Freeing pipeline ...
I compiled from master for this kernel.
Are /dev/video0 and /dev/video1 loopback devices?
Try adding exclusive_caps=0 to the module-parameters, since GStreamer-0.10 doesn't like output-only devices.
exclusive_caps is a module parameter, you need to pass it when loading the loopback device driver:
rmmod v4l2loopback
modprobe v4l2loopback exclusive_caps=0
(it's unrelated to GStreamer)
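if you want the parameter to apply automatically whenever the module loads, a common approach (not from this thread; assuming a standard modprobe.d setup) is:
# hypothetical: make exclusive_caps=0 the default for every future modprobe
echo "options v4l2loopback exclusive_caps=0" | sudo tee /etc/modprobe.d/v4l2loopback.conf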
Yes! It works. For your information, it also works when you follow this step:
As for the initial question, I tried adding ! queue ! (everywhere, too) but it's still complaining. Will try to find a pipeline that works... Will post back if I succeed.
also check whether adding a tee element helps (as described in #83), and/or whether it has been fixed in 0.10.0
I just spent half a day. Turns out that 0.10.0 is buggy. You have to upgrade v4l2loopback to 0.12.0 for this to work.
Here is my command that finally works (I have a USB HDMI Encoder which produces mjpeg):
gst-launch-1.0 v4l2src device=/dev/video0 ! image/jpeg, width=1920, height=1080, framerate=30/1 ! tee ! v4l2sink device=/dev/video55
Hope this helps someone.
while you are upgrading to 0.12.0, you might go for the full route and upgrade to 0.12.5, which is the 5th bugfix (sic!) sub-release of the v4l2loopback-0.12 release
Hi, I have a similar symptom. I built from source on the master branch and installed the module with modprobe v4l2loopback exclusive_caps=0
I'm streaming a camera with
gst-launch-1.0.exe -v ksvideosrc device-name="HD USB Camera" ! "image/jpeg,width=640,height=480" ! rtpjpegpay ! udpsink host=192.168.1.103 port=5001
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstKsVideoSrc:ksvideosrc0.GstPad:src: caps = image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)151/5, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)151/5, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.200000, payload=(int)26, ssrc=(uint)1462621853, timestamp-offset=(uint)1737146780, seqnum-offset=(uint)13494
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)JPEG, a-framerate=(string)30.200000, payload=(int)26, ssrc=(uint)1462621853, timestamp-offset=(uint)1737146780, seqnum-offset=(uint)13494
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0.GstPad:sink: caps = image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)151/5, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = image/jpeg, width=(int)640, height=(int)480, framerate=(fraction)151/5, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: timestamp = 1737226281
/GstPipeline:pipeline0/GstRtpJPEGPay:rtpjpegpay0: seqnum = 13509
WARNING: from element /GstPipeline:pipeline0/GstUDPSink:udpsink0: Pipeline construction is invalid, please add queues.
Additional debug info:
../libs/gst/base/gstbasesink.c(1218): gst_base_sink_query_latency (): /GstPipeline:pipeline0/GstUDPSink:udpsink0:
Not enough buffering available for the processing deadline of 0:00:00.020000000, add enough queues to buffer 0:00:00.020000000 additional data. Shortening processing latency to 0:00:00.000000000.
and trying to receive the data with
gst-launch-1.0 -v udpsrc port=5001 ! application/x-rtp,encoding-name=JPEG,payload=26,framerate=30 ! rtpjpegdepay ! jpegparse ! jpegdec ! v4l2sink device=/dev/video0
Setting pipeline to PAUSED ...
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, framerate=(int)30, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:sink: caps = application/x-rtp, encoding-name=(string)JPEG, payload=(int)26, framerate=(int)30, media=(string)video, clock-rate=(int)90000
/GstPipeline:pipeline0/GstRtpJPEGDepay:rtpjpegdepay0.GstPad:src: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstJpegParse:jpegparse0.GstPad:sink: caps = image/jpeg, framerate=(fraction)0/1, width=(int)640, height=(int)480
/GstPipeline:pipeline0/GstJpegParse:jpegparse0.GstPad:src: caps = image/jpeg, parsed=(boolean)true, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:sink: caps = image/jpeg, parsed=(boolean)true, format=(string)UYVY, width=(int)640, height=(int)480, framerate=(fraction)0/1
/GstPipeline:pipeline0/GstJpegDec:jpegdec0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)640, height=(int)480, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)jpeg, colorimetry=(string)1:4:0:0, framerate=(fraction)0/1
ERROR: from element /GstPipeline:pipeline0/GstUDPSrc:udpsrc0: Internal data stream error.
Additional debug info:
gstbasesrc.c(3055): gst_base_src_loop (): /GstPipeline:pipeline0/GstUDPSrc:udpsrc0:
streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.012224615
Setting pipeline to PAUSED ...
Setting pipeline to READY ...
Setting pipeline to NULL ...
Freeing pipeline ...
I'm able to receive correctly the image with autovideosink
gst-launch-1.0 udpsrc port=5001 ! application/x-rtp, encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegparse ! jpegdec ! autovideosink
Thanks for any advice :D
please replace the images with text.
Looks like you need to add a queue in front of the udpsink.
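Applied to the sender pipeline above, that would look something like this (an untested sketch):
gst-launch-1.0.exe -v ksvideosrc device-name="HD USB Camera" ! "image/jpeg,width=640,height=480" ! rtpjpegpay ! queue ! udpsink host=192.168.1.103 port=5001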
I have the same issue; it seems that v4l2sink cannot accept JPEG? Here are simpler pipelines for those interested.
This does not work
gst-launch-1.0 videotestsrc ! jpegenc ! v4l2sink device=/dev/video9
This does work
gst-launch-1.0 videotestsrc ! avenc_mjpeg ! v4l2sink device=/dev/video9
however, I find that avenc_mjpeg consumes much more CPU than jpegenc, and will drop frames
it seems the loopback/v4l2sink is pretty picky about its caps?
I figured it out. The issue is that v4l2sink needs parsed jpegs. Looking at the caps of the avenc_mjpeg (which works with v4l2sink):
$ gst-inspect-1.0 avenc_mjpeg
...
Pad Templates:
SRC template: 'src'
Availability: Always
Capabilities:
image/jpeg
parsed: true
Then looking at the caps of jpegenc:
$ gst-inspect-1.0 jpegenc
...
SRC template: 'src'
Availability: Always
Capabilities:
image/jpeg
width: [ 16, 65535 ]
height: [ 16, 65535 ]
framerate: [ 0/1, 2147483647/1 ]
sof-marker: { (int)0, (int)1, (int)2, (int)4, (int)9 }
The key difference is the parsed attribute. So to make your pipeline work with JPEGs (or jpegenc), you can add parsed=true to the caps of the JPEG stream:
# launch the sink
gst-launch-1.0 videotestsrc ! jpegenc ! image/jpeg,parsed=true ! v4l2sink device=/dev/video9 &
# read from the src
gst-launch-1.0 v4l2src device=/dev/video9 ! jpegdec ! xvimagesink
Not sure, but the root cause may be that jpegenc decides its own colorimetry and sof-marker, and if they don't match the sink's caps it raises an upstream error.
A workaround is to use multipart mux/demux to change the colorimetry to 2:4:7:1 and the sof-marker to 0, as expected by v4l2sink.
gst-launch-1.0 v4l2src device=/dev/video1 ! video/x-raw,format=YUY2,width=1344,height=376,framerate=30/1 ! videoconvert ! jpegenc ! multipartmux ! multipartdemux single-stream=1 ! 'image/jpeg, parsed=(boolean)true, width=(int)1344, height=(int)376, colorimetry=(string)2:4:7:1, framerate=(fraction)30/1,sof-marker=(int)0' ! v4l2sink device=/dev/video2 -v
There might be other ways, someone more skilled may further comment.
I have also had problems playing into the v4l2sink from a recorded file because of colorimetry.
In my case it helped to add a videoconvert before the v4l2sink: the caps now show the correct colorimetry, though I don't know how much processing goes into it.
Protip: if you add -v to your pipeline you should be able to see what videoconvert is doing; then you can look at your plugin sinks/sources and just add caps to ensure the plugins do the right thing. For example: jpegdec often outputs YUY2, but it also supports I420. In my experience I420 has fewer artifacts, and I'll often add the cap to ensure my decoder sends out I420:
... ! jpegdec ! video/x-raw,format=I420 ! ...
If you use a videoconvert it'll convert the YUY2 to I420, which adds unnecessary processing.
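Applied to the UDP receiver pipeline from earlier in this thread, that could look like this (a sketch, untested):
gst-launch-1.0 -v udpsrc port=5001 ! application/x-rtp,encoding-name=JPEG,payload=26 ! rtpjpegdepay ! jpegparse ! jpegdec ! video/x-raw,format=I420 ! v4l2sink device=/dev/video0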
Yes, that's true.
Unfortunately in my case I had an AVI container with RAW frames, and it wasn't able to output the necessary colorimetry (negotiation would fail if I tried to force it). To me it seems like a limitation or error in avidemux...
So there is no way of getting around the videoconvert to fix the colorimetry.
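For reference, this is the kind of pipeline I mean (a sketch; recording.avi and the device number are placeholders):
gst-launch-1.0 filesrc location=recording.avi ! avidemux ! videoconvert ! v4l2sink device=/dev/video2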
But does anybody know if the colorimetry=2:4:7:1 requirement is a limitation of GStreamer (v4l2sink) or of the v4l2loopback module itself?
For this it's best to have a look at the output of strace for all involved processes and see where that error message is generated and what causes it to print. From there you can analyze further whether the failure is due to a syscall to v4l2loopback or just a logic error inside the various involved programs.
AFAICS, there are some restrictions on the format information that v4l2loopback puts on its processed video data (pixel formats are somewhat limited), but a missing conversion step is likely a userland issue.
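e.g. something along these lines (a sketch; the pipeline and the output file name are placeholders):
# trace only ioctl syscalls (where the v4l2 format negotiation happens) into v4l2sink.trace
strace -f -e trace=ioctl -o v4l2sink.trace gst-launch-1.0 videotestsrc ! jpegenc ! image/jpeg,parsed=true ! v4l2sink device=/dev/video9
then look in the trace for failing ioctl calls (return value -1) on the loopback device.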
It is definitely GStreamer that is giving the message about the wrong colorimetry.
My question was more whether it is the v4l2loopback module that imposes this limitation, as you mention with the limited pixel formats, or whether it is the GStreamer implementation of v4l2sink that does.
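One way to check what the loopback device itself accepts is to list its output formats (assuming v4l2-ctl from v4l-utils is available; the device number is a placeholder):
# list the pixel formats the device advertises for output (i.e. for writing into the loopback)
v4l2-ctl --device=/dev/video9 --list-formats-out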
Hi,
Using a simple pipeline like this works:
BUT using udpsrc instead of videotestsrc doesn't work (it does with autovideosink, but not with v4l2sink):
From my raspi:
From my computer (crashing, see above):