Open thielepaul opened 1 year ago
@thielepaul, Unfortunately JP5.x will not run on a Nano 4GB Developer Kit, though it will run on an Orin Nano SOC. I have JP5.02 installed on a Xavier NX 8GB running DC 4.4.dev4.
I believe that JP4.6.1/2 is the last JP version that will run on the Nano 4GB Developer Kit, and we are presently testing the DC `tf_2_9` branch for the installation of DC 4.4.dev6 on JP4.6.1/2.
The `tf_2_9` branch installs Python 3.9, tf2.9, and opencv4.6 + GStreamer on JP4.6.1/2 using a Mambaforge environment in place of the system Python environment. We built a tf2.9 wheel for Python 3.9, but opencv4.6 + GStreamer must be built from source, which takes about two hours.
We are presently beta testing the installation instructions for JP4.6.1/2 and should be able to push `tf_2_9` into the DC `main` branch shortly.
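For anyone unsure which JetPack/L4T release a board is actually running, a quick check along these lines can help (the release file is present on stock JetPack images; the fallback message here is just illustrative):

```shell
# /etc/nv_tegra_release exists on stock JetPack/L4T images and names the
# L4T release (e.g. R32.x for JP4.6.x, R35.x for JP5.x).
if [ -f /etc/nv_tegra_release ]; then
  L4T_RELEASE=$(cat /etc/nv_tegra_release)
else
  L4T_RELEASE="not a Jetson: /etc/nv_tegra_release missing"
fi
echo "$L4T_RELEASE"
```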
TCIII
Thank you for clarifying that for me, and it's nice to hear that there is development going on to update Donkey Car to support JetPack 4.6.1!
I suggest we close this issue once the changes are merged to master?
@thielepaul,
Hopefully you are familiar with CMake to build opencv4.6 + GStreamer from source. I tried to build a wheel for opencv4.6 + GStreamer but was unsuccessful no matter what we tried. I recommend having a fan on the Nano 4GB heat sink (do not attempt to use a Nano 2GB) and the Nano plugged into a 5 vdc power supply using a UPS as the build portion uses all four processors and takes about two hours. We had to build opencv4.6 + GStreamer from source because the conda version of opencv4.6 does not include GStreamer. It took over 81 hours to successfully build the tensorflow 2.9 wheel.
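As a rough illustration of what that CMake configuration can look like: the flag names below are standard OpenCV build options, but the install prefix, source layout, and job count are assumptions, not the project's exact recipe.

```shell
# Sketch of an OpenCV 4.6 + GStreamer source build configuration.
# The full build (make -j4) uses all four cores and takes about two hours
# on a Nano 4GB, so only the command line is echoed here.
OPENCV_VER=4.6.0
CMAKE_CMD="cmake \
 -D CMAKE_BUILD_TYPE=RELEASE \
 -D CMAKE_INSTALL_PREFIX=${CONDA_PREFIX:-/usr/local} \
 -D WITH_GSTREAMER=ON \
 -D WITH_CUDA=ON \
 -D BUILD_opencv_python3=ON \
 ../opencv-$OPENCV_VER"
echo "$CMAKE_CMD"
# On the device, roughly:
#   wget https://github.com/opencv/opencv/archive/$OPENCV_VER.tar.gz
#   tar xf $OPENCV_VER.tar.gz && mkdir build && cd build
#   $CMAKE_CMD && make -j4 && sudo make install
```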
TCIII
Yes, it would be a far nicer user experience if it were not necessary to build it manually :grimacing:
Is it maybe possible to use a different base os, e.g. armbian so we can just install the packages in more recent versions from the distro repositories? https://www.armbian.com/jetson-nano/ latest download: https://github.com/armbian/community/releases/download/202307/Armbian_23.02.0-trunk_Jetson-nano_lunar_edge_6.2.0-rc6.img.xz This seems to come with opencv4.6: https://packages.ubuntu.com/lunar/python3-opencv
I just started playing around with the jetson nano yesterday so I haven't tried much so far (I got DC 4.4 running on jetpack 4.5.1)
@thielepaul
Tensorflow 2.9 and opencv 4.6 are fairly recent versions, though not the latest, and have been found to work well with DC training. I doubt that armbian will be considered as a replacement for JP4.6.1/2/Ubuntu 18.04, as the latter is well supported by the IoT community.
TCIII
Why not send us a basic manual for installing JP46 and TF29?
@Ezward @DocGarbanzo
@Ahrovan, Are you willing to spend two hours building and installing opencv4.6 + GStreamer? I run my Nano's 5 vdc power supply on a UPS so that powerline glitches and outages will not affect the building and installing opencv4.6 + GStreamer with CMake. Additionally, we had to use mambaforge, in place of the Python env, to be able to install Python 3.9. Some users don't like conda.
TCIII
I have been working on JP46 for about a day. It doesn't matter; it would help if you could explain how to do it.
@Ahrovan,
I am just a software/hardware tester who validated the DC `tf_2_9` branch installation instructions, so it will be up to @DocGarbanzo as to the release of the Beta installation instructions.
TCIII
@Ahrovan - there is a branch `Update-to-tf-2_9` in the donkeydocs repo which contains the installation instructions for the `tf_2_9` branch. Feel free to try this out and let us know if you run into any problems, because @TCIII and I are still updating these instructions.
@DocGarbanzo `tf_2_9` installed on Windows & Xavier NX so far without problems (with small changes to the install steps)
@Ahrovan,
Could you please document the "small changes in install steps"?
Did you install JP5.x on the Xavier NX?
TCIII
JP502 installed on Xavier NX

Step 1

```shell
wget https://github.com/conda-forge/miniforge/releases/latest/download/Mambaforge-Linux-aarch64.sh
chmod u+x Mambaforge-Linux-aarch64.sh
./Mambaforge-Linux-aarch64.sh
```

Step 2

```shell
mkdir projects
cd projects
git clone https://github.com/autorope/donkeycar
cd donkeycar
git checkout tf_2_9
mamba env create -f install/envs/jetson.yml
conda activate donkey
pip install -e .[nano]
```

Changes in `donkeycar/tree/tf_2_9/install/envs/jetson.yml`:

```yaml
...
- pip:
  - simple-pid
```

- removed `kivy-jetson` and `git+https://github.com/autorope/keras-vis.git` (duplicated with setup.py)
- moved as a separate step: `pip3 install --pre --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v502 tensorflow==2.9.1+nv22.09`
- to solve a problem, needed to change to `keras-vis @ git+https://github.com/autorope/keras-vis.git@master`
- just from a quick test: `donkey ui` doesn't work properly on JP502 @TCIII @DocGarbanzo
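A hedged sanity check after the steps above (assuming the `donkey` env is active; where the packages are missing, it only prints a fallback rather than failing):

```shell
# Report the installed tensorflow/opencv versions and whether OpenCV was
# built with GStreamer support; fall back gracefully outside the env.
CHECK=$(python3 - 2>/dev/null <<'EOF'
import cv2, tensorflow as tf
print("tensorflow:", tf.__version__)
print("opencv:", cv2.__version__)
lines = [l.strip() for l in cv2.getBuildInformation().splitlines() if "GStreamer" in l]
print(lines[0] if lines else "no GStreamer line in build info")
EOF
) || CHECK="donkey env not active or packages missing"
echo "$CHECK"
```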
@Ahrovan,
> just for test, donkey ui, don't work properly in JP502 @TCIII @DocGarbanzo
Very strange as I have had Donkey UI working on my Xavier NX 8GB running JP502 and the tf_2_9 branch for a number of months now.
TCIII
> JP502 installed on Xavier NX (Step 1 and Step 2 as above)
> - removed `kivy-jetson` and `git+https://github.com/autorope/keras-vis.git` (duplicated with setup.py)
> - moved as a separate step: `pip3 install --pre --extra-index-url https://developer.download.nvidia.com/compute/redist/jp/v502 tensorflow==2.9.1+nv22.09`
> - to solve a problem, needed to change to `keras-vis @ git+https://github.com/autorope/keras-vis.git@master`
> - just from a quick test: `donkey ui` doesn't work properly on JP502 @TCIII @DocGarbanzo
If you don't do the changes mentioned here, does the install fail? Could you paste the error? Could you also paste the error from the UI? Or is it running without error and behaving unexpectedly?
@DocGarbanzo,
`./Mambaforge-Linux-aarch64.sh`
should be: `bash ./Mambaforge-Linux-aarch64.sh`
TCIII
@DocGarbanzo,
I have a spare Xavier NX running an older version of DC on which I can attempt to install JP502 and the DC `tf_2_9` branch this weekend.
TCIII
@Ahrovan,
Are you using your Xavier NX for training or driving? I use mine for training only.
Are you using a SSD with your Xavier NX?
I am running a 250 GB SSD on my Xavier NX and I have JP booting up on the SSD.
I did not install the swap file on the Xavier NX until I had JP booting up on the SSD.
I used the attached file to create an 8 GB swap file in addition to the 3.3 GB of zram.
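The attached file itself is not reproduced here, but a typical 8 GB swap file setup looks roughly like this (the path and size are assumptions; it only prints a dry run unless `APPLY=1` is set and it runs as root on the Jetson):

```shell
# Sketch of an 8 GB swap file setup; prints commands instead of running
# them unless explicitly applied as root.
run() {
  if [ "${APPLY:-0}" = "1" ] && [ "$(id -u)" -eq 0 ]; then "$@"; else echo "(dry run) $*"; fi
}
run fallocate -l 8G /swapfile
run chmod 600 /swapfile
run mkswap /swapfile
run swapon /swapfile
run sh -c "echo '/swapfile none swap sw 0 0' >> /etc/fstab"
echo "verify with: free -h && swapon --show"
```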
@TCIII
@DocGarbanzo My changes are not very important; the main steps are correct and just needed some small modifications. I will share the errors when I check again.
@Ahrovan,
Are you booting JP from the SSD?
If so, did you create a 6 GB swap file before or after moving JP from the microSD card to the SSD?
TCIII
@TCIII Yes, booting from SSD
@Ahrovan Re: "just for test, donkey ui, don't work properly in JP502 @TCIII @DocGarbanzo"
Is this the error that you are seeing when you run `donkey ui` on the CLI:
`WARNING:kivy:stderr: ModuleNotFoundError: No module named 'kivy._clock'`
TCIII
@TCIII At first, yes; but later, some buttons did not work in the UI, mostly in the "pilot arena" tab.
@Ahrovan,
How did you resolve the error and have donkey ui load successfully?
TCIII
@TCIII No, I need to check later; donkey ui loads, but the "pilot arena" tab had problems.
@Ahrovan,
Could you please activate the donkey env, run `conda list`, and report the `kivy-jetson` and `kivy-garden` versions?
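For reference, the requested check could look like this (assuming the env is named `donkey` and conda is on PATH; elsewhere it just prints a fallback):

```shell
# List the kivy-related packages in the donkey env, if conda is available.
KIVY_VERSIONS=$(conda list -n donkey 2>/dev/null | grep -i kivy) \
  || KIVY_VERSIONS="conda env 'donkey' not available here"
echo "$KIVY_VERSIONS"
```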
TCIII
The donkeycar docs say that only jetpack 4.5.1 is currently supported: https://docs.donkeycar.com/guide/robot_sbc/setup_jetson_nano/
I would be interested in what the issues with newer JetPack versions such as 4.6 or 5.1 are, and which steps would be necessary to get DC working with the latest JetPack versions.