ArduCAM / MIPI_Camera


Gstreamer support for Jetvariety cameras #44

Open sputnick opened 4 years ago

sputnick commented 4 years ago

I'm finding the lack of code examples for the jetvariety cameras I bought for the Jetson Nano hard to swallow compared to the Raspberry Pi's examples and support. The lack of examples wouldn't be as bad if the cameras simply worked with the Nvidia examples here: https://github.com/dusty-nv/jetson-inference

However, pretty much every Nvidia example and tool for inference (the whole point of using Arducams with the Jetson platform) uses gstreamer (1.14.5.0) pipelines and those all fail with jetvariety cameras (/dev/video0 and /dev/video1). See screenshot below.

Neither gst-launch-1.0 with v4l2src nor nvarguscamerasrc works... Is this a bug? If not, how can I use gstreamer with Jetvariety cameras?

(screenshot: gstreamer error)

sputnick commented 4 years ago

FYI, after building the jetson-inference project there is a tool under build/aarch64/bin called "v4l2-display". Running this does not display the camera, but it also does not throw an error like all the inference examples do.

$ ./v4l2-display /dev/video0
v4l2-display
  args (2):  0 [./v4l2-display]  1 [/dev/video0]  
v4l2-display:   attempting to initialize video device '/dev/video0'

v4l2 -- V4L2_CAP_VIDEO_CAPTURE yes
v4l2 -- V4L2_CAP_READWRITE no
v4l2 -- V4L2_CAP_ASYNCIO   no
v4l2 -- V4L2_CAP_STREAMING yes
v4l2 -- format #0
v4l2 --   desc   8-bit Greyscale
v4l2 --   flags  V4L2_FMT_FLAG_UNCOMPRESSED
v4l2 --   fourcc 0x59455247  UNKNOWN
v4l2 -- format #1
v4l2 --   desc   10-bit Greyscale
v4l2 --   flags  V4L2_FMT_FLAG_UNCOMPRESSED
v4l2 --   fourcc 0x20303159  UNKNOWN
v4l2 -- format #2
v4l2 --   desc   16-bit Greyscale
v4l2 --   flags  V4L2_FMT_FLAG_UNCOMPRESSED
v4l2 --   fourcc 0x20363159  UNKNOWN
v4l2 -- preexisting format
v4l2 --   width  1280
v4l2 --   height 800
v4l2 --   pitch  2560
v4l2 --   size   2048000
v4l2 --   format 0x20303159  UNKNOWN
v4l2 --   color  0x8
v4l2 --   field  0x1
v4l2 -- setting new format...
v4l2 --   width  1280
v4l2 --   height 800
v4l2 --   pitch  0
v4l2 --   size   0
v4l2 --   format 0x20303159  UNKNOWN
v4l2 --   color  0x8
v4l2 --   field  0x1
v4l2 -- confirmed new format
v4l2 --   width  1280
v4l2 --   height 800
v4l2 --   pitch  2560
v4l2 --   size   2048000
v4l2 --   format 0x20303159  UNKNOWN
v4l2 --   color  0x8
v4l2 --   field  0x1
v4l2 -- mapped 4 capture buffers with mmap

v4l2-display:  successfully initialized video device '/dev/video0'
    width:  1280
   height:  800
    depth:  16 (bpp)

v4l2-display:  un-initializing video device '/dev/video0'
[OpenGL] glDisplay -- X screen 0 resolution:  1920x1080
[OpenGL] glDisplay -- display device initialized
[OpenGL]   creating 1280x800 texture
v4l2-display:  initialized 1280 x 800 openGL texture (1024000 bytes)
v4l2-display:  video device '/dev/video0' has been un-initialized.
v4l2-display:  this concludes the test of video device '/dev/video0'
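Side note on the "UNKNOWN" fourcc values in the log above: V4L2 fourccs are just four ASCII characters packed little-endian into a 32-bit integer. A minimal standalone sketch (not part of jetson-inference) to decode them:

```python
def decode_fourcc(code: int) -> str:
    """Decode a V4L2 fourcc integer into its 4-character ASCII tag.
    V4L2 packs the characters little-endian: byte 0 is the first letter."""
    return "".join(chr((code >> (8 * i)) & 0xFF) for i in range(4))

# The three "UNKNOWN" fourccs reported by v4l2-display:
print(decode_fourcc(0x59455247))  # GREY  (8-bit greyscale)
print(decode_fourcc(0x20303159))  # "Y10 " (10-bit greyscale)
print(decode_fourcc(0x20363159))  # "Y16 " (16-bit greyscale)
```

So the tool is seeing the same GREY / Y10 / Y16 formats that v4l2-ctl reports; it just doesn't know their names.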
sputnick commented 4 years ago

FYI, these pipelines manage to grab some frames using "videoconvert" and those formats.

gst-launch-1.0 v4l2src num-buffers=200 device=/dev/video0 ! videoconvert ! 'video/x-raw, format=YUY2, width=1280, height=800, framerate=30/1' ! videoconvert ! omxh264enc ! qtmux ! filesink location=test.mp4 -ev

gst-launch-1.0 v4l2src num-buffers=200 device=/dev/video0 ! videoconvert ! 'video/x-raw, format=GRAY8, width=1280, height=800, framerate=30/1' ! videoconvert ! omxh264enc ! qtmux ! filesink location=test.mp4 -ev

Neither "format" matches what v4l2-ctl reports (v4l2-ctl --list-formats-ext):

ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'GREY'
    Name        : 8-bit Greyscale
        Size: Discrete 1280x800
        Size: Discrete 1280x720
        Size: Discrete 640x400
        Size: Discrete 320x200

    Index       : 1
    Type        : Video Capture
    Pixel Format: 'Y10 '
    Name        : 10-bit Greyscale
        Size: Discrete 1280x800

    Index       : 2
    Type        : Video Capture
    Pixel Format: 'Y16 '
    Name        : 16-bit Greyscale
        Size: Discrete 1280x800
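The pitch and size values in the earlier v4l2-display log line up with these formats if you assume the usual V4L2 packing: 'Y10 ' and 'Y16 ' store each sample in a 16-bit container (2 bytes per pixel), 'GREY' uses 1 byte per pixel. A quick arithmetic sketch:

```python
def frame_geometry(width: int, height: int, bytes_per_pixel: int):
    """Return (pitch, size) for a packed single-plane frame:
    pitch = bytes per row, size = bytes per frame."""
    pitch = width * bytes_per_pixel
    return pitch, pitch * height

# 'Y10 ' / 'Y16 ' keep each sample in a 16-bit container (2 bytes):
print(frame_geometry(1280, 800, 2))  # (2560, 2048000), matching the v4l2-display log
# 'GREY' is one byte per pixel:
print(frame_geometry(1280, 800, 1))  # (1280, 1024000)
```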
sputnick commented 4 years ago

This is the most minimal gst-launch-1.0 pipeline that works for me, although I don't know what is needed for Jetvariety cams to work out of the box with jetson-inference:

gst-launch-1.0 v4l2src device="/dev/video0" ! 'video/x-raw, width=1280, height=800' ! videoconvert ! xvimagesink sync=false

If you omit "sync=false", it will crash.

sputnick commented 4 years ago

Some good news!

If you edit: jetson-inference/utils/camera/gstCamera.cpp

and change the pixel format from "YUY2" to "GRAY8" on line 432, the camera will show and inference is at least attempted. The performance isn't great, probably because of the double conversion it needs to make when initializing the camera, but it is hopefully a start for you guys at Arducam to optimize. Remember to rebuild the C++ source by doing

cd jetson-inference/build
make

You also have to specify the width and height on the command line for the examples to run without gst-launch-1.0 errors, e.g.

cd jetson-inference/build/aarch64/bin/
./imagenet-camera --camera=/dev/video0 --width=1280 --height=800

I also added the "framerate" parameter as a hard-coded option just to test, but I'm not sure it does anything...

glddiv commented 4 years ago

Hi @sputnick The reason it can't be used normally is that gstreamer does not support Bayer pixel formats of 10 bits or higher. You can make it work only because your camera outputs GRAY; another camera would not work this way. Using other cameras requires manual conversion of the pixel format. At present we have not provided code examples for this, so that work needs to be done by the user.

sputnick commented 4 years ago

I'm not following. What exactly is it that the "user" needs to implement?

Just as further info, I wanted to add that inference using "GRAY" converted to RGB (in the gstreamer pipeline) doesn't work. I mean the picture is shown and the models run, but even the simplest things like FaceNet do nothing, and object detection is almost completely wrong.

Just searching with "bayer" and "gstreamer" as keywords, there seem to be some implementations in the wild, including this deprecated plugin: https://developer.ridgerun.com/wiki/index.php?title=GStreamer_OpenCL_Bayer_to_RGB_converter
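Since glddiv says the pixel-format conversion has to be done by the user, here is a minimal sketch of what "manual conversion" means for an 8-bit BGGR Bayer frame: a nearest-neighbour demosaic in pure Python. This is deliberately simple and slow, and the layout is only an assumption about how the sensor packs its data; a real pipeline would use bayer2rgb, OpenCV's cvtColor, or a CUDA kernel instead.

```python
def demosaic_bggr_nn(raw, width, height):
    """Nearest-neighbour demosaic of an 8-bit BGGR Bayer frame.
    `raw` is a flat list of width*height sample bytes; returns a flat
    list of (r, g, b) tuples, one per pixel.  Each 2x2 Bayer cell
      B G
      G R
    contributes its B, its R, and its averaged G to all four pixels."""
    rgb = [None] * (width * height)
    for y in range(0, height, 2):
        for x in range(0, width, 2):
            b = raw[y * width + x]
            g = (raw[y * width + x + 1] + raw[(y + 1) * width + x]) // 2
            r = raw[(y + 1) * width + x + 1]
            for dy in (0, 1):
                for dx in (0, 1):
                    rgb[(y + dy) * width + (x + dx)] = (r, g, b)
    return rgb

# Single 2x2 cell: B=10, G=20/30, R=40 -> every pixel becomes (40, 25, 10)
print(demosaic_bggr_nn([10, 20, 30, 40], 2, 2))
```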

sputnick commented 4 years ago

Seems like there are others struggling with the same issue. We really need some leadership from Arducam to show how to use your cameras for the number-one purpose they are bought for: inference...

This user is having issues with Arducam stereoscopic cameras and inference... https://forums.developer.nvidia.com/t/exception-jetson-utils-failed-to-create-gldisplay-device/125071/16

ArduCAM commented 4 years ago

@sputnick I'm sorry to hear that you still have problems with our camera. Since you already get a clear image from the camera, one thing I am not sure about is whether the training datasets come from color images rather than a monochrome camera like this one. If so, the trained model works well on a standard IMX219 camera but doesn't work well on a black-and-white camera like the OV9281.

ArduCAM commented 4 years ago

The stereoscopic IMX219 camera we provide only outputs an unprocessed RAW video stream; it doesn't have ISP functions like auto white balance and auto exposure control. I am not sure if that also impacts inference with trained models.

Sedwin97 commented 4 years ago

@sputnick, here's what I got that works better. Videoconvert is unbearably slow on the Nano due to the hardware. Based on this example from JetsonHacks, the best pipeline I got was the following: gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=1600, height=1300, format=GRAY8' ! nvvidconv ! nvegltransform ! nveglglessink sync=false -e

I'm not sure if the other formats are available; I just don't know what the proper value for format would need to be.

Of course, the right size and format should be chosen based off of the output of v4l2-ctl --list-formats-ext -d /dev/video0

I'm still testing this, but those using OpenCV with gstreamer can use the following: v4l2src device=/dev/video0 ! video/x-raw, width=1600, height=1300, format=GRAY8 ! nvvidconv ! appsink max-buffers=1 drop=true
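For the OpenCV route, the pipeline is just a string handed to cv2.VideoCapture with the cv2.CAP_GSTREAMER backend. A small sketch that builds it from the values reported by v4l2-ctl (the helper name is mine, and the cv2 call is shown only in a comment since it needs a camera attached):

```python
def gst_appsink_pipeline(device="/dev/video0", width=1600, height=1300, fmt="GRAY8"):
    """Build a v4l2src -> nvvidconv -> appsink pipeline string for OpenCV.
    width/height/fmt must match an entry from `v4l2-ctl --list-formats-ext`."""
    return (
        f"v4l2src device={device} ! "
        f"video/x-raw, width={width}, height={height}, format={fmt} ! "
        "nvvidconv ! appsink max-buffers=1 drop=true"
    )

print(gst_appsink_pipeline())
# With a camera attached, you would then open it like:
#   import cv2
#   cap = cv2.VideoCapture(gst_appsink_pipeline(), cv2.CAP_GSTREAMER)
```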

razvanphp commented 4 years ago

I'm using the Arducam version of the HQ camera (IMX477) with the Jetvariety board, color, but also without much luck in gstreamer.

The camera is seen by v4l2 after installing the kernel driver:

jetson@nano:~$ v4l2-ctl --list-formats-ext -d0
ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'BA81'
    Name        : 8-bit Bayer BGBG/GRGR
        Size: Discrete 4032x3040
        Size: Discrete 1920x1080
        Size: Discrete 1600x1200
        Size: Discrete 1280x960
        Size: Discrete 1280x720
        Size: Discrete 640x480

I can open it with this pipeline, but the image is green even though it is debayered, and the framerate is very bad: max 15 fps no matter the resolution:

jetson@nano:~$ gst-launch-1.0 -v v4l2src device=/dev/video0 ! 'video/x-bayer, width=1920, height=1080, format=bggr' ! bayer2rgb ! nvvidconv ! nvv4l2h264enc maxperf-enable=1 iframeinterval=10 ! 'video/x-h264, profile=(string)high, level=(string)4.1' ! rtph264pay ! udpsink host=192.168.1.57 port=1234 sync=false
Setting pipeline to PAUSED ...
Opening in BLOCKING MODE
Pipeline is live and does not need PREROLL ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstV4l2Src:v4l2src0.GstPad:src: caps = video/x-bayer, width=(int)1920, height=(int)1080, format=(string)bggr, framerate=(fraction)120/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-bayer, width=(int)1920, height=(int)1080, format=(string)bggr, framerate=(fraction)120/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:src: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)120/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive, format=(string)BGRx
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:src: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)120/1, format=(string)I420, interlace-mode=(string)progressive
/GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)high, level=(string)4.1, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)120/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)high, level=(string)4.1, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)120/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)high, level=(string)4.1, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)120/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, profile=(string)high, level=(string)4.1, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)120/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2
Redistribute latency...
NvMMLiteOpen : Block : BlockType = 4
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4
/GstPipeline:pipeline0/nvv4l2h264enc:nvv4l2h264enc0.GstPad:sink: caps = video/x-raw(memory:NVMM), width=(int)1920, height=(int)1080, framerate=(fraction)120/1, format=(string)I420, interlace-mode=(string)progressive
/GstPipeline:pipeline0/Gstnvvconv:nvvconv0.GstPad:sink: caps = video/x-raw, width=(int)1920, height=(int)1080, framerate=(fraction)120/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive, format=(string)BGRx
/GstPipeline:pipeline0/GstBayer2RGB:bayer2rgb0.GstPad:sink: caps = video/x-bayer, width=(int)1920, height=(int)1080, format=(string)bggr, framerate=(fraction)120/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-bayer, width=(int)1920, height=(int)1080, format=(string)bggr, framerate=(fraction)120/1, colorimetry=(string)2:4:7:1, interlace-mode=(string)progressive
H264: Profile = 100, Level = 41
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, ssrc=(uint)1850625670, timestamp-offset=(uint)1364239141, seqnum-offset=(uint)800, a-framerate=(string)120
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, payload=(int)96, clock-rate=(int)90000, encoding-name=(string)H264, ssrc=(uint)1850625670, timestamp-offset=(uint)1364239141, seqnum-offset=(uint)800, a-framerate=(string)120
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, sprop-parameter-sets=(string)"Z2QMKawsqAeAIn5U\,aO48sA\=\=", payload=(int)96, seqnum-offset=(uint)800, timestamp-offset=(uint)1364239141, ssrc=(uint)1850625670, a-framerate=(string)120
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, sprop-parameter-sets=(string)"Z2QMKawsqAeAIn5U\,aO48sA\=\=", payload=(int)96, seqnum-offset=(uint)800, timestamp-offset=(uint)1364239141, ssrc=(uint)1850625670, a-framerate=(string)120
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 1364274129
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 800
^C

Does it have to do with the lack of an ISP in the camera? I would love to use nvarguscamerasrc to be able to use the built-in NVIDIA ISP, but the plugin says no camera detected.

Any hints in this case?

glddiv commented 4 years ago

Hi @razvanphp Yes, the reason for these phenomena is that the camera lacks ISP support. At present, the Jetvariety series does not support the ISP built into the Jetson Nano, so you cannot use nvarguscamerasrc.

razvanphp commented 4 years ago

@glddiv ok, then what about the AR1820HS camera (B0216)? I bought the one with an ISP on it, but it still cannot be accessed with gstreamer:

Error generated. ...gstnvarguscamerasrc.cpp, execute:557 No cameras available
Got EOS from element "pipeline0".
jetson@nano:~$ v4l2-ctl --list-formats-ext
ioctl: VIDIOC_ENUM_FMT
    Index       : 0
    Type        : Video Capture
    Pixel Format: 'Y16 '
    Name        : 16-bit Greyscale
        Size: Discrete 640x480
        Size: Discrete 1216x920
        Size: Discrete 1920x1080
        Size: Discrete 2432x1842
        Size: Discrete 4896x3684

    Index       : 1
    Type        : Video Capture
    Pixel Format: 'BA10'
    Name        : 10-bit Bayer GRGR/BGBG
        Size: Discrete 640x480
        Size: Discrete 1216x920
        Size: Discrete 1920x1080
        Size: Discrete 2432x1842
        Size: Discrete 4896x3684

and this v4l2 pipeline runs, but shows a black image:

gst-launch-1.0 v4l2src device=/dev/video0 ! 'video/x-raw, width=1920, height=1080, format=GRAY16_LE' ! videoconvert ! nvvidconv ! nvv4l2h264enc maxperf-enable=1 iframeinterval=10 ! 'video/x-h264, profile=(string)high, level=(string)4.1' ! rtph264pay ! udpsink host=192.168.1.57 port=1234 sync=false
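One possible cause of the black image (an assumption, not confirmed by Arducam): if the sensor delivers 10-bit samples inside the 16-bit 'Y16 ' container, the values only occupy the low bits, so a naive GRAY16_LE display renders nearly black. A sketch of the per-pixel scaling you would apply before display:

```python
def y10_in_y16_to_gray8(samples):
    """Map 10-bit samples stored in 16-bit words down to 8 bits by
    dropping the two least-significant bits (10 -> 8)."""
    return [s >> 2 for s in samples]

# Full-scale 10-bit white (1023) maps to 255; without this shift a
# 16-bit display treats 1023/65535 as almost black.
print(y10_in_y16_to_gray8([0, 512, 1023]))  # [0, 128, 255]
```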
glddiv commented 4 years ago

Hi @razvanphp We added a deb package with an IMX477 driver in commit 0e4d7319e3227bf6fa905d43c724374443c570e6. This driver is provided by NVIDIA-Jetson-IMX477-RPIV3 and supports the ISP built into the Jetson Nano. If you want to use it, please remove the Jetvariety board and connect the IMX477 directly to the Jetson Nano.

glddiv commented 4 years ago

@glddiv ok, then what about the AR1820HS camera (B0216)? I bought the one with an ISP on it, but it still cannot be accessed with gstreamer:

As far as I know, the AR1820HS camera (B0216) does not have an ISP.

razvanphp commented 4 years ago

@glddiv the IMX477 worked beautifully, thanks for that!

The only thing still needed is allowing a higher FPS: since the Arducam has all 4 lanes connected, it should reach 240 fps in Full HD.