rccraigb opened 10 months ago
More info: I'm beginning to suspect that it is a throughput issue. When I try to sample video on the Hadron/Orin outside of the Docker container, the frame rate is abysmal:
mcadmin@edge-mcm:~$ ffmpeg -f v4l2 -framerate 1 -video_size 640x360 -input_format yuyv422 -i /dev/video0 -c copy out.mkv
ffmpeg version 4.2.7-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/aarch64-linux-gnu --incdir=/usr/include/aarch64-linux-gnu --arch=arm64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
libavutil 56. 31.100 / 56. 31.100
libavcodec 58. 54.100 / 58. 54.100
libavformat 58. 29.100 / 58. 29.100
libavdevice 58. 8.100 / 58. 8.100
libavfilter 7. 57.100 / 7. 57.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 5.100 / 5. 5.100
libswresample 3. 5.100 / 3. 5.100
libpostproc 55. 5.100 / 55. 5.100
[video4linux2,v4l2 @ 0xaaab0e32c800] The V4L2 driver changed the video from 640x360 to 1280x720
[video4linux2,v4l2 @ 0xaaab0e32c800] The driver changed the time per frame from 1/1 to 1/5
Input #0, video4linux2,v4l2, from '/dev/video0':
Duration: N/A, start: 6935.544640, bitrate: 73728 kb/s
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1280x720, 73728 kb/s, 5 fps, 5 tbr, 1000k tbn, 1000k tbc
File 'out.mkv' already exists. Overwrite ? [y/N] y
Output #0, matroska, to 'out.mkv':
Metadata:
encoder : Lavf58.29.100
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1280x720, q=2-31, 73728 kb/s, 5 fps, 5 tbr, 1k tbn, 1000k tbc
Stream mapping:
Stream #0:0 -> #0:0 (copy)
Press [q] to stop, [?] for help
frame= 9 fps=0.5 q=-1.0 Lsize= 16201kB time=00:00:01.60 bitrate=82897.2kbits/s speed=0.0863x
video:16200kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 0.005986%
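For context, a rough bandwidth estimate (assuming YUYV422 at 2 bytes per pixel):

$ echo "$((1280 * 720 * 2 * 8 * 5 / 1000)) kb/s at 5 fps, $((1280 * 720 * 2 * 8 * 30 / 1000)) kb/s at 30 fps"
73728 kb/s at 5 fps, 442368 kb/s at 30 fps

The 5 fps figure matches the 73728 kb/s reported above; at a more typical 30 fps the same raw stream would need roughly 442 Mb/s, which a marginal USB link may not be able to sustain.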
OK I see, yes I was going to suggest trying your V4L2 USB camera with another utility to deduce whether it's related to GStreamer (or my code), or more so to the connection / system configuration. While I haven't used ffmpeg to capture V4L2 (cool trick btw), it would seem that's having issues too. You could also try cheese if that works for you.
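For example (assuming the v4l-utils package is installed), v4l2-ctl can show what formats and frame rates the driver actually advertises, independent of GStreamer or ffmpeg:

$ sudo apt-get install v4l-utils                      # if not already installed
$ v4l2-ctl --device=/dev/video0 --list-formats-ext    # supported pixel formats / sizes / frame rates
$ v4l2-ctl --device=/dev/video0 --all                 # current format and streaming parameters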
There are probably a bunch of potential reasons why a USB camera behaves differently between the devkit and a customized carrier/system, such as whether the USB3 connection is actually working on the carrier, for example. What power mode is the Hadron running in?
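For reference, a couple of quick checks with standard JetPack tools (nothing specific to this carrier):

$ lsusb -t              # the camera's port should report 5000M if the USB3 link negotiated
$ sudo nvpmodel -q      # show the current power mode
$ sudo jetson_clocks    # lock clocks to their maximums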
If it does seem like the root of the issue is not jetson-inference, then folks more knowledgeable than I am in those areas can probably help; I'd post a more general question about the camera on the Jetson forums and also contact ConnectTech support (probably try both).
I tried an old USB 2.0 camera and had better luck, so this seems to be related to throughput over the hand-built cable to the carrier board jumper. Even at the full resolution/frame rate of the older camera, it still did not work. However, by reducing the width and height of the frame, the stream is supported without errors:
detectnet --input-width=640 --input-height=480 --headless /dev/video0 file.mp4
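For what it's worth, video-viewer should accept the same flags, since both tools share the jetson-inference videoSource options; a hypothetical equivalent (not a command from this thread) would be:

video-viewer --input-width=640 --input-height=480 --headless /dev/video0 output.mp4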
Thank you for your help and especially for jetson-inference -- it is very impressive!!
Aha, okay - glad you figured out the culprit! My pleasure to support jetson developers, good luck on your project!
Hello,
I have two Jetson systems: an NVIDIA Jetson AGX Orin Developer Kit and a ConnectTech Hadron/Orin NX, which is headless. I was able to successfully run the detectnet binary within Docker using the Arducam on the developer kit, but I have not been able to get the camera to work on the Hadron/Orin using the exact same setup. I recently tried to use video-viewer in headless mode with a raw, reduced-resolution stream, but the program stalls with the same error no matter which test program I try: "waiting for the next image buffer". Any suggestions for resolving this would be appreciated.
Thank you! Craig