alliedvision / linux_nvidia_jetson

Allied Vision CSI-2 camera driver for NVIDIA Jetson Systems.

Precompiled kernel not working #13

Closed jsalazar30 closed 2 years ago

jsalazar30 commented 3 years ago

Recently I got the Alvium camera kit from https://www.alliedvision.com/en/products/alvium-camera-kit.html and I am trying to connect it to an NVIDIA Jetson Nano.

I've been trying to make it work using the precompiled kernel from https://www.alliedvision.com/en/products/software/embedded-software-and-drivers.html

No matter what I try, no video device can be found; /dev/video0 is not created. Are there any detailed installation instructions? I am not sure what I am doing wrong.

Also, I found an SD card image for the Jetson Nano, but the link is broken.

qbmestriaux commented 3 years ago

Exactly the same for me. I followed the instructions, and no video device appears in /dev. I tried connecting an RPi v2 camera, and it works, but only before installing the AVT driver. After installing the AVT driver, the RPi camera is no longer found.

BernardoLuck commented 3 years ago

Hello,

we have found out that new Nanos are delivered with a newer version of the bootloader on the module. This causes an issue with the install script: the device tree file cannot be updated with the newest bootloader, so the camera cannot be found. A workaround is to use the NVIDIA SDK Manager and manually install JetPack 4.4 DP (L4T 32.4.2). This downgrades the bootloader on the module and allows the script to install correctly. This issue will be fixed in the next version of the driver.

Best regards

Bernardo Luck Villanueva // Applications Engineer at Allied Vision

qbmestriaux commented 3 years ago

Hello Bernardo,

Thank you very much for your reply. I've flashed my Nano with the NVIDIA SDK Manager, but something still goes wrong. Here is the output of "apt show nvidia-jetpack -a":

Package: nvidia-jetpack
Version: 4.4.1-b50
Priority: standard
Section: metapackages
Maintainer: NVIDIA Corporation
Installed-Size: 199 kB
Depends: nvidia-cuda (= 4.4.1-b50), nvidia-opencv (= 4.4.1-b50), nvidia-cudnn8 (= 4.4.1-b50), nvidia-tensorrt (= 4.4.1-b50), nvidia-visionworks (= 4.4.1-b50), nvidia-container (= 4.4.1-b50), nvidia-vpi (= 4.4.1-b50), nvidia-l4t-jetson-multimedia-api (>> 32.4-0), nvidia-l4t-jetson-multimedia-api (<< 32.5-0)
Homepage: http://developer.nvidia.com/jetson
Download-Size: 29,4 kB
APT-Sources: https://repo.download.nvidia.com/jetson/t210 r32.4/main arm64 Packages
Description: NVIDIA Jetpack Meta Package

Package: nvidia-jetpack
Version: 4.4-b186
Priority: standard
Section: metapackages
Maintainer: NVIDIA Corporation
Installed-Size: 199 kB
Depends: nvidia-cuda (= 4.4-b186), nvidia-opencv (= 4.4-b186), nvidia-cudnn8 (= 4.4-b186), nvidia-tensorrt (= 4.4-b186), nvidia-visionworks (= 4.4-b186), nvidia-container (= 4.4-b186), nvidia-vpi (= 4.4-b186), nvidia-l4t-jetson-multimedia-api (>> 32.4-0), nvidia-l4t-jetson-multimedia-api (<< 32.5-0)
Homepage: http://developer.nvidia.com/jetson
Download-Size: 29,3 kB
APT-Sources: https://repo.download.nvidia.com/jetson/t210 r32.4/main arm64 Packages
Description: NVIDIA Jetpack Meta Package

Package: nvidia-jetpack
Version: 4.4-b144
Priority: standard
Section: metapackages
Maintainer: NVIDIA Corporation
Installed-Size: 200 kB
Depends: nvidia-container-csv-cuda (= 10.2.89-1), libopencv-python (= 4.1.1-2-gd5a58aa75), libvisionworks-sfm-dev (= 0.90.4.501), libvisionworks-dev (= 1.6.0.501), libnvparsers7 (= 7.1.0-1+cuda10.2), libnvinfer-plugin-dev (= 7.1.0-1+cuda10.2), libnvonnxparsers7 (= 7.1.0-1+cuda10.2), libnvinfer-samples (= 7.1.0-1+cuda10.2), libnvinfer-bin (= 7.1.0-1+cuda10.2), libvisionworks-samples (= 1.6.0.501), libvisionworks-tracking-dev (= 0.88.2.501), vpi-samples (= 0.2.0), tensorrt (= 7.1.0.16-1+cuda10.2), libopencv (= 4.1.1-2-gd5a58aa75), libnvinfer-doc (= 7.1.0-1+cuda10.2), libnvparsers-dev (= 7.1.0-1+cuda10.2), libnvidia-container0 (= 0.9.0~beta.1), nvidia-container-csv-visionworks (= 1.6.0.501), cuda-toolkit-10-2 (= 10.2.89-1), graphsurgeon-tf (= 7.1.0-1+cuda10.2), libcudnn8 (= 8.0.0.145-1+cuda10.2), libopencv-samples (= 4.1.1-2-gd5a58aa75), nvidia-container-csv-cudnn (= 8.0.0.145-1+cuda10.2), python-libnvinfer-dev (= 7.1.0-1+cuda10.2), libnvinfer-plugin7 (= 7.1.0-1+cuda10.2), libvisionworks (= 1.6.0.501), libcudnn8-doc (= 8.0.0.145-1+cuda10.2), nvidia-container-toolkit (= 1.0.1-1), libnvinfer-dev (= 7.1.0-1+cuda10.2), nvidia-l4t-jetson-multimedia-api (>> 32.4-0), nvidia-l4t-jetson-multimedia-api (<< 32.5-0), libopencv-dev (= 4.1.1-2-gd5a58aa75), vpi-dev (= 0.2.0), vpi (= 0.2.0), libcudnn8-dev (= 8.0.0.145-1+cuda10.2), python3-libnvinfer (= 7.1.0-1+cuda10.2), python3-libnvinfer-dev (= 7.1.0-1+cuda10.2), opencv-licenses (= 4.1.1-2-gd5a58aa75), nvidia-container-csv-tensorrt (= 7.1.0.16-1+cuda10.2), libnvinfer7 (= 7.1.0-1+cuda10.2), libnvonnxparsers-dev (= 7.1.0-1+cuda10.2), uff-converter-tf (= 7.1.0-1+cuda10.2), nvidia-docker2 (= 2.2.0-1), libvisionworks-sfm (= 0.90.4.501), libnvidia-container-tools (= 0.9.0~beta.1), nvidia-container-runtime (= 3.1.0-1), python-libnvinfer (= 7.1.0-1+cuda10.2), libvisionworks-tracking (= 0.88.2.501)
Conflicts: cuda-command-line-tools-10-0, cuda-compiler-10-0, cuda-cublas-10-0, cuda-cublas-dev-10-0, cuda-cudart-10-0, cuda-cudart-dev-10-0, cuda-cufft-10-0, cuda-cufft-dev-10-0, cuda-cuobjdump-10-0, cuda-cupti-10-0, cuda-curand-10-0, cuda-curand-dev-10-0, cuda-cusolver-10-0, cuda-cusolver-dev-10-0, cuda-cusparse-10-0, cuda-cusparse-dev-10-0, cuda-documentation-10-0, cuda-driver-dev-10-0, cuda-gdb-10-0, cuda-gpu-library-advisor-10-0, cuda-libraries-10-0, cuda-libraries-dev-10-0, cuda-license-10-0, cuda-memcheck-10-0, cuda-misc-headers-10-0, cuda-npp-10-0, cuda-npp-dev-10-0, cuda-nsight-compute-addon-l4t-10-0, cuda-nvcc-10-0, cuda-nvdisasm-10-0, cuda-nvgraph-10-0, cuda-nvgraph-dev-10-0, cuda-nvml-dev-10-0, cuda-nvprof-10-0, cuda-nvprune-10-0, cuda-nvrtc-10-0, cuda-nvrtc-dev-10-0, cuda-nvtx-10-0, cuda-samples-10-0, cuda-toolkit-10-0, cuda-tools-10-0, libcudnn7, libcudnn7-dev, libcudnn7-doc, libnvinfer-plugin6, libnvinfer6, libnvonnxparsers6, libnvparsers6
Homepage: http://developer.nvidia.com/jetson
Download-Size: 30,4 kB
APT-Sources: https://repo.download.nvidia.com/jetson/t210 r32.4/main arm64 Packages
Description: NVIDIA Jetpack Meta Package

I've installed the AVT driver, and I still have the problem, according to the output of dmesg | grep avt:

[ 1.240938] avt_csi2 7-003c: i2c read failed (-121)
[ 1.240944] avt_csi2 7-003c: avt_csi2_probe: read_cci_registers failed: -121
[ 1.240954] avt_csi2: probe of 7-003c failed with error -121
[ 1.241229] avt_csi2 8-003c: i2c read failed (-121)
[ 1.241235] avt_csi2 8-003c: avt_csi2_probe: read_cci_registers failed: -121
[ 1.241243] avt_csi2: probe of 8-003c failed with error -121
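(Editor's note for readers hitting the same trace: error -121 is the kernel's EREMOTEIO, meaning the I2C read to the camera got no response. A quick way to check both the driver probe and the resulting device nodes is the sketch below; it assumes nothing beyond coreutils, except that `v4l2-ctl` comes from the optional `v4l-utils` package.)

```shell
# Look for Allied Vision probe messages in the kernel log
dmesg 2>/dev/null | grep -i 'avt_csi2' || echo "no avt_csi2 messages found"

# List any V4L2 device nodes that were created
ls /dev/video* 2>/dev/null || echo "no /dev/video* node present"

# If v4l-utils is installed, show which driver backs each node
if command -v v4l2-ctl >/dev/null 2>&1; then
    v4l2-ctl --list-devices
fi
```

If the first grep prints probe lines but no /dev/video* node exists, the driver loaded but the camera was not detected; if the grep prints nothing at all, the driver itself did not load.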

The other problem is that JetPack 4.4 DP is not offered in the list of available target operating systems in the NVIDIA SDK Manager. There are 4.5, 4.4.1, and 4.4, but not 4.4 DP. So how can I install 4.4 DP? I already tried flashing the SD card with version 4.4 DP before, and it did not work.

Thank you for your answer, Best Regards,

BernardoLuck commented 3 years ago

Hello!

the output of dmesg | grep avt is perfect! It means the driver has been installed. Error -121 means the driver cannot physically find the camera. So the issue is either the cable connections, or that our adapter board was not powered up before the carrier board. It is important not to power the adapter from the USB port of your DevKit. The DevKit powers up, scans for all connected CSI-2 devices, and only then powers up the USB controller. This means that if you connect our adapter board to the USB port of the DevKit, it will only receive power after the CSI-2 scan has already started, so the camera will not be found and you will see error -121. The suggestion is to use an external USB power supply and power the adapter board before powering up the DevKit.

Best Regards,

qbmestriaux commented 3 years ago

Hello, thanks again for the fast reply. I've connected the boards as you described, powering the Alvium first with an external supply (5 V, 2.5 A) and only then powering the Jetson (5 V, 4 A supply through the barrel jack). Now dmesg | grep avt gives no output at all, and no /dev/video* entry appears. I think I need a /dev/video* file to use the examples from GitHub, right?

The LEDs on the camera and on the intermediate small board are green. Regarding the connections, for the first cable (camera → small board): the "A" of the "CAMERA" marking is close to the green LED of the Alvium. On the small board side, the "T" of "HOST" is next to the small board's green LED.

For the second flex cable (small board → Jetson Nano): on the small board side, the contacts face the interior of the small board; on the other side, the contacts face the interior of the Jetson Nano. This flex cable is connected to the "CAM0" port of the Nano.

What would be wrong ? Best Regards,

BernardoLuck commented 3 years ago

Hi, my colleague Arun has just sent you an email via our ticket support. We forgot to mention that 2.5 A may be a little high; I would suggest using a 1.5 A to at most 2.0 A power supply. One last idea: if dmesg shows nothing even though the driver has been installed, then the bootloader may have been updated. Make sure you do not run apt upgrade; otherwise L4T will be updated. You can recognize this because the first boot screen will be white with the green NVIDIA logo. Our driver will run if this first boot screen has a black background with the green NVIDIA logo.

Best regards.

guillebot commented 3 years ago

Why would 2.5 A be too high? You don't decide how many amps the device is going to draw; that's only how many are available.

You can never have a "too many amps" power supply.


SteveC-LO commented 3 years ago

Experiencing the same issues. Can you please confirm which versions of JetPack to install for the prebuilt driver? The instructions here require JetPack 4.4.1 (L4T 32.4.4). This does not appear to work. I have also tried JetPack 4.4 DP (L4T 32.4.2). dmesg indicates that the AVT drivers have not loaded:

[ 1.763352] imx219 8-0010: imx219_board_setup: error during i2c read probe (-121)

The thread above suggests some issues with the bootloader?

BernardoLuck commented 3 years ago

Hi Steve,

[ 1.763352] imx219 8-0010: imx219_board_setup: error during i2c read probe (-121)

You are correct; I can see that our driver was not loaded. There are some issues with the newest version of the bootloader. As you may know, it is installed on the module. If you upgrade your system to JetPack 4.5 or buy a new module, you will have the latest version of the bootloader. Unfortunately, if you use an SD card image with JetPack 4.4 DP or JetPack 4.4.1 and our drivers are installed, they will not get loaded as long as the newest bootloader is installed. The workaround is to use a Linux PC, install the NVIDIA SDK Manager there, select JetPack 4.4.1, and flash the module. This will downgrade the bootloader and allow our driver to be loaded after running the install script. At the moment, this is the only workaround. Our R&D team is working on a fix for this issue on the latest bootloader.
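(Editor's note: after reflashing, the L4T release that actually ended up on the module can be double-checked from the BSP release file. A minimal sketch, assuming the standard /etc/nv_tegra_release file that L4T installs; the helper name `l4t_release` is just illustrative.)

```shell
# Extract a "32.4.4"-style version from an nv_tegra_release file, whose
# first line looks like:
#   # R32 (release), REVISION: 4.4, GCID: ..., BOARD: t210ref, ...
l4t_release() {
    sed -n 's/^# R\([0-9]*\) (release), REVISION: \([0-9.]*\),.*/\1.\2/p' "$1"
}

l4t_release /etc/nv_tegra_release 2>/dev/null || echo "not an L4T system"
```

A result of 32.4.4 corresponds to JetPack 4.4.1, and 32.4.2 to JetPack 4.4 DP.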

SteveC-LO commented 3 years ago

Thank you Bernardo, it is now clear I should give the SDK manager a go!

Best,

Steve


ritvik-ranadive commented 3 years ago

Hi,

I am facing the same problem that @qbmestriaux mentioned. Before installing the precompiled kernel shared at https://www.alliedvision.com/en/products/software/embedded-software-and-drivers/, the RasPi V2 camera was working on the Jetson Xavier NX but the Alvium 1800 C-158 was not. After installing the precompiled kernel, the Alvium 1800 C-158 works, but the RasPi V2 camera no longer does.

Regards, Ritvik

BernardoLuck commented 3 years ago

Hi Ritvik,

when you install the Allied Vision MIPI driver, the device tree is changed so that any of our cameras can be used on any of the CSI-2 ports of the Nano. No other camera can be connected with our implementation. However, you could rewrite our device tree implementation so that one port is assigned to the Alvium and the other port to your RasPi camera. Regards, Bernardo
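(Editor's note: a first step toward that kind of device-tree split is to see which camera nodes the running kernel actually carries. A hedged sketch, assuming `dtc` from the `device-tree-compiler` package is available; the `avt_csi2`/`imx219` patterns are illustrative search terms, not guaranteed node names in the installed tree.)

```shell
# Decompile the live device tree and show lines mentioning either camera
if command -v dtc >/dev/null 2>&1; then
    dtc -I fs -O dts /proc/device-tree 2>/dev/null \
        | grep -n 'avt_csi2\|imx219' \
        || echo "no matching camera nodes found"
else
    echo "install device-tree-compiler to inspect the live device tree"
fi
```

The decompiled output shows which nodes would need their `status` and `compatible` properties adjusted per port when editing the device tree sources.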

amarburg commented 3 years ago

we have found out that new Nanos are delivered with a newer version of the bootloader on the module. This causes an issue with the install script: the device tree file cannot be updated with the newest bootloader, so the camera cannot be found. A workaround is to use the NVIDIA SDK Manager and manually install JetPack 4.4 DP (L4T 32.4.2). This downgrades the bootloader on the module and allows the script to install correctly. This issue will be fixed in the next version of the driver.

I'm tracking this issue while debugging a problem with a Xavier NX Developer Kit. Is it possible to tell whether one has a "good" or "bad" bootloader, for example from messages on the serial debug port while booting?

krs1980 commented 3 years ago

The method described above really works. After following the steps, the Jetson Nano detects the camera.

guillebot commented 3 years ago

Hi guys, as of June 2021, a couple of questions:

  1. Is this still the only method that works?
  2. Do you plan to upgrade source and/or precompiled to support JetPack 4.5.1?

Thanks

AVTechie commented 3 years ago

Support for JetPack 4.5.1, in both source and precompiled form, is coming on Monday at the latest.