dusty-nv / jetson-inference

Hello AI World guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson.
https://developer.nvidia.com/embedded/twodaystoademo
MIT License

Arducam: -- a timeout occurred waiting for the next image buffer #1775

Open rccraigb opened 10 months ago

rccraigb commented 10 months ago

Hello,

I have two Jetson systems: an NVIDIA Jetson AGX Orin Developer Kit and a headless ConnectTech Hadron/Orin NX. I was able to run the detectnet binary within Docker using the Arducam on the developer kit, but I have not been able to get the camera to work on the Hadron/Orin with the exact same setup. Most recently I tried video-viewer in headless mode with a raw input codec at reduced resolution, but no matter which test program I try, it stalls with the same error: "a timeout occurred waiting for the next image buffer". Any suggestions for resolving this would be appreciated.

Thank you! Craig

root@edge-mcm:/jetson-inference# video-viewer --headless --input-codec=raw --input-width=320 --input-height=240 /dev/video0 rtp://172.26.51.174:4815
[gstreamer] initialized gstreamer, version 1.16.3.0
[gstreamer] gstCamera -- attempting to create device v4l2:///dev/video0
[gstreamer] gstCamera -- found v4l2 device: Arducam B0459 (USB3 12MP)
[gstreamer] v4l2-proplist, device.path=(string)/dev/video0, udev-probed=(boolean)false, device.api=(string)v4l2, v4l2.device.driver=(string)uvcvideo, v4l2.device.card=(string)"Arducam\ B0459\ \(USB3\ 12MP\)", v4l2.device.bus_info=(string)usb-3610000.xhci-2, v4l2.device.version=(uint)330344, v4l2.device.capabilities=(uint)2225078273, v4l2.device.device_caps=(uint)69206017;
[gstreamer] gstCamera -- found 5 caps for v4l2 device /dev/video0
[gstreamer] [0] video/x-raw, format=(string)YUY2, width=(int)4056, height=(int)3040, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 10/1, 5/1 };
[gstreamer] [1] video/x-raw, format=(string)YUY2, width=(int)3840, height=(int)2160, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 20/1, 15/1, 10/1, 5/1 };
[gstreamer] [2] video/x-raw, format=(string)YUY2, width=(int)2016, height=(int)1520, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 50/1, 30/1, 15/1, 10/1, 5/1 };
[gstreamer] [3] video/x-raw, format=(string)YUY2, width=(int)1920, height=(int)1080, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 60/1, 30/1, 15/1, 10/1, 5/1 };
[gstreamer] [4] video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction){ 120/1, 60/1, 30/1, 15/1, 10/1, 5/1 };
[gstreamer] gstCamera -- selected device profile:  codec=raw format=yuyv width=1280 height=720
[gstreamer] gstCamera pipeline string:
[gstreamer] v4l2src device=/dev/video0 do-timestamp=true ! video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720 ! appsink name=mysink sync=false
[gstreamer] gstCamera successfully created device v4l2:///dev/video0
[video]  created gstCamera from v4l2:///dev/video0
------------------------------------------------
gstCamera video options:
------------------------------------------------
  -- URI: v4l2:///dev/video0
     - protocol:  v4l2
     - location:  /dev/video0
  -- deviceType: v4l2
  -- ioType:     input
  -- codec:      raw
  -- codecType:  cpu
  -- width:      1280
  -- height:     720
  -- frameRate:  120
  -- numBuffers: 4
  -- zeroCopy:   true
  -- flipMethod: none
------------------------------------------------
[gstreamer] gstEncoder -- codec not specified, defaulting to H.264
failed to find/open file /proc/device-tree/model
[gstreamer] gstEncoder -- detected board 'NVIDIA Orin NX'
[gstreamer] gstEncoder -- pipeline launch string:
[gstreamer] appsrc name=mysource is-live=true do-timestamp=true format=3 ! nvvidconv name=vidconv ! video/x-raw(memory:NVMM) ! nvv4l2h264enc name=encoder bitrate=4000000 insert-sps-pps=1 insert-vui=1 idrinterval=30 maxperf-enable=1 ! video/x-h264 ! rtph264pay config-interval=1 ! udpsink host=172.26.51.174 port=4815 auto-multicast=true
No EGL Display 
nvbufsurftransform: Could not get EGL display connection
[video]  created gstEncoder from rtp://172.26.51.174:4815
------------------------------------------------
gstEncoder video options:
------------------------------------------------
  -- URI: rtp://172.26.51.174:4815
     - protocol:  rtp
     - location:  172.26.51.174
     - port:      4815
  -- deviceType: ip
  -- ioType:     output
  -- codec:      H264
  -- codecType:  v4l2
  -- frameRate:  30
  -- bitRate:    4000000
  -- numBuffers: 4
  -- zeroCopy:   true
  -- latency     10
------------------------------------------------
[gstreamer] opening gstCamera for streaming, transitioning pipeline to GST_STATE_PLAYING
[gstreamer] gstreamer changed state from NULL to READY ==> mysink
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter0
[gstreamer] gstreamer changed state from NULL to READY ==> v4l2src0
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter0
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> v4l2src0
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline0
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer message new-clock ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> v4l2src0
[gstreamer] gstreamer message stream-start ==> pipeline0
[gstreamer] gstCamera -- onPreroll
[gstreamer] gstBufferManager recieve caps:  video/x-raw, format=(string)YUY2, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)120/1, interlace-mode=(string)progressive
[gstreamer] gstBufferManager -- recieved first frame, codec=raw format=yuyv width=1280 height=720 size=1843200
[cuda]   allocated 4 ring buffers (1843200 bytes each, 7372800 bytes total)
[cuda]   allocated 4 ring buffers (8 bytes each, 32 bytes total)
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysink
[gstreamer] gstreamer message async-done ==> pipeline0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysink
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> pipeline0
[cuda]   allocated 4 ring buffers (2764800 bytes each, 11059200 bytes total)
video-viewer:  captured 0 frames (1280x720)
[cuda]   allocated 2 ring buffers (1382400 bytes each, 2764800 bytes total)
[gstreamer] gstEncoder -- starting pipeline, transitioning to GST_STATE_PLAYING
[gstreamer] gstreamer message qos ==> v4l2src0
nvbuf_utils: Could not get EGL display connection
Opening in BLOCKING MODE 
[gstreamer] gstreamer changed state from NULL to READY ==> udpsink0
[gstreamer] gstreamer changed state from NULL to READY ==> rtph264pay0
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter2
[gstreamer] gstreamer changed state from NULL to READY ==> encoder
[gstreamer] gstreamer changed state from NULL to READY ==> capsfilter1
[gstreamer] gstreamer changed state from NULL to READY ==> vidconv
[gstreamer] gstreamer changed state from NULL to READY ==> mysource
[gstreamer] gstreamer changed state from NULL to READY ==> pipeline1
[gstreamer] gstreamer changed state from READY to PAUSED ==> rtph264pay0
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter2
[gstreamer] gstreamer changed state from READY to PAUSED ==> encoder
[gstreamer] gstreamer changed state from READY to PAUSED ==> capsfilter1
[gstreamer] gstreamer changed state from READY to PAUSED ==> vidconv
[gstreamer] gstreamer stream status CREATE ==> src
[gstreamer] gstreamer changed state from READY to PAUSED ==> mysource
[gstreamer] gstreamer changed state from READY to PAUSED ==> pipeline1
[gstreamer] gstreamer message new-clock ==> pipeline1
[gstreamer] gstreamer stream status ENTER ==> src
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> rtph264pay0
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter2
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> encoder
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> capsfilter1
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> vidconv
[gstreamer] gstreamer changed state from PAUSED to PLAYING ==> mysource
[gstreamer] gstEncoder -- new caps: video/x-raw, width=1280, height=720, format=(string)I420, framerate=30/1
video-viewer:  captured 1 frames (1280x720)
NvMMLiteOpen : Block : BlockType = 4 
===== NVMEDIA: NVENC =====
NvMMLiteBlockCreate : Block : BlockType = 4 
[gstreamer] gstreamer message stream-start ==> pipeline1
[gstreamer] gstreamer message latency ==> encoder
H264: Profile = 66, Level = 0 
NVMEDIA: Need to set EMC bandwidth : 376000 
NVMEDIA_ENC: bBlitMode is set to TRUE 
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
^Creceived SIGINT
[gstreamer] gstCamera::Capture() -- a timeout occurred waiting for the next image buffer
video-viewer:  shutting down...
[gstreamer] gstCamera -- stopping pipeline, transitioning to GST_STATE_NULL
[gstreamer] gstCamera -- pipeline stopped
[gstreamer] gstEncoder -- shutting down pipeline, sending EOS
[gstreamer] gstEncoder -- transitioning pipeline to GST_STATE_NULL
[gstreamer] gstEncoder -- pipeline stopped
video-viewer:  shutdown complete
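One way to narrow this down is to probe the V4L2 device directly with gst-launch-1.0, bypassing jetson-inference entirely. This is a rough sketch, not a definitive test; the caps match what gstCamera negotiated in the log above, but element availability depends on your GStreamer install:

```shell
#!/bin/sh
# Probe /dev/video0 with a minimal GStreamer pipeline (no display, no encoder).
# If this also stalls, the problem is below jetson-inference (driver/USB link).
DEV=/dev/video0
if [ -e "$DEV" ] && command -v gst-launch-1.0 >/dev/null 2>&1; then
    # Same caps gstCamera negotiated above; fakesink discards the frames,
    # so only capture throughput is being exercised.
    gst-launch-1.0 -v v4l2src device="$DEV" num-buffers=30 \
        ! "video/x-raw,format=YUY2,width=1280,height=720" \
        ! fakesink sync=false
else
    echo "skipping: no $DEV or gst-launch-1.0 not installed"
fi
```

If the pipeline delivers its 30 buffers promptly, the camera and USB link are fine and the issue is upstream in the application; if it hangs, jetson-inference is off the hook.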
rccraigb commented 10 months ago

More info: I'm beginning to suspect that it is a throughput issue. When I try to sample video on the Hadron/Orin outside of the Docker container, the frame rate is abysmal:

mcadmin@edge-mcm:~$ ffmpeg -f v4l2 -framerate 1 -video_size 640x360 -input_format yuyv422 -i /dev/video0 -c copy out.mkv
ffmpeg version 4.2.7-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
  built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
  configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/aarch64-linux-gnu --incdir=/usr/include/aarch64-linux-gnu --arch=arm64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
  libavutil      56. 31.100 / 56. 31.100
  libavcodec     58. 54.100 / 58. 54.100
  libavformat    58. 29.100 / 58. 29.100
  libavdevice    58.  8.100 / 58.  8.100
  libavfilter     7. 57.100 /  7. 57.100
  libavresample   4.  0.  0 /  4.  0.  0
  libswscale      5.  5.100 /  5.  5.100
  libswresample   3.  5.100 /  3.  5.100
  libpostproc    55.  5.100 / 55.  5.100
[video4linux2,v4l2 @ 0xaaab0e32c800] The V4L2 driver changed the video from 640x360 to 1280x720
[video4linux2,v4l2 @ 0xaaab0e32c800] The driver changed the time per frame from 1/1 to 1/5
Input #0, video4linux2,v4l2, from '/dev/video0':
  Duration: N/A, start: 6935.544640, bitrate: 73728 kb/s
    Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1280x720, 73728 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
File 'out.mkv' already exists. Overwrite ? [y/N] y
Output #0, matroska, to 'out.mkv':
  Metadata:
    encoder         : Lavf58.29.100
    Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1280x720, q=2-31, 73728 kb/s, 5 fps, 5 tbr, 1k tbn, 1000k tbc
Stream mapping:
  Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
frame=    9 fps=0.5 q=-1.0 Lsize=   16201kB time=00:00:01.60 bitrate=82897.2kbits/s speed=0.0863x    
video:16200kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.005986%
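The numbers are consistent with a link-bandwidth problem. YUY2 is uncompressed at 2 bytes per pixel, so the mode gstCamera selected needs far more than the roughly 35 MB/s of usable USB 2.0 throughput. A back-of-the-envelope sketch (real attainable rates depend on the controller and cable quality):

```python
# Estimate raw YUY2 (2 bytes/pixel) bandwidth for camera modes from the logs.
def yuy2_bandwidth_mbps(width: int, height: int, fps: int) -> float:
    """Required throughput in megabytes per second for uncompressed YUY2."""
    return width * height * 2 * fps / 1e6

modes = [
    (1280, 720, 120),  # mode gstCamera selected above (~221 MB/s, needs USB3)
    (1280, 720, 5),    # rate the driver actually delivered to ffmpeg
    (640, 480, 30),    # reduced mode that fits comfortably in USB 2.0 budget
]
for w, h, fps in modes:
    print(f"{w}x{h}@{fps}: {yuy2_bandwidth_mbps(w, h, fps):.1f} MB/s")
```

The 5 fps the driver fell back to is roughly what a degraded link can sustain at 1280x720, which fits the "throughput over the cable" theory.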
dusty-nv commented 10 months ago

OK, I see - yes, I was going to suggest trying your V4L2 USB camera with another utility to determine whether the problem is in GStreamer (or my code), or more likely in the connection / system configuration. While I haven't used ffmpeg to capture V4L2 (cool trick btw), it would seem that's having issues too. You could also try cheese if it runs on your system.

There are a bunch of potential reasons why a USB camera could behave differently between the devkit and a customized carrier/system - for example, is the USB3 connection actually working on the carrier? What power mode is the Hadron running in?

If the root of the issue turns out not to be jetson-inference, then folks more knowledgeable than I am in those areas can probably help - I'd post a more general camera question on the Jetson forums and to ConnectTech support (probably try both).
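Both of those checks can be run from a shell on the Jetson. A sketch assuming a standard JetPack userland (lsusb and nvpmodel ship with it, but paths and availability can differ on a custom BSP):

```shell
#!/bin/sh
# Check 1: negotiated USB link speed. lsusb -t prints the speed per port:
# 480M = USB 2.0, 5000M = USB 3.0. A USB3 camera enumerating at 480M
# points to a cabling/signal-integrity problem on the carrier.
if command -v lsusb >/dev/null 2>&1; then
    lsusb -t
else
    echo "lsusb not installed"
fi

# Check 2: active power mode. A low-power mode caps clocks and can
# throttle the USB/encoder path. (May need sudo on some setups.)
if command -v nvpmodel >/dev/null 2>&1; then
    nvpmodel -q
else
    echo "nvpmodel not found (not a JetPack system?)"
fi
```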

rccraigb commented 10 months ago

I tried an old USB 2.0 camera and had better luck, so this does seem to be a throughput issue over the hand-built cable to the carrier-board jumper. Even the older camera still failed at its full resolution/frame rate; however, by reducing the frame width and height, the stream runs without errors:

detectnet --input-width=640 --input-height=480 --headless /dev/video0 file.mp4

Thank you for your help and especially for jetson-inference -- it is very impressive!!

dusty-nv commented 10 months ago

Aha, okay - glad you figured out the culprit! My pleasure to support jetson developers, good luck on your project!