NVIDIA-AI-IOT / jetbot

An educational AI robot based on NVIDIA Jetson Nano.

Suggestion on improving JetBot/Jetson Nano performance #182

Closed stevenlee090 closed 2 years ago

stevenlee090 commented 4 years ago

Hi All,

I recently got a JetBot kit and was able to get it assembled and the software installed. The examples run fine, but they seem a bit too sluggish compared to the videos linked on the GitHub examples page.

For reference, I followed the instructions at https://github.com/NVIDIA-AI-IOT/jetbot/wiki/software-setup and set everything up using jetbot_image_v0p4p0.zip.

However, I am not sure whether it is Jupyter Lab/Notebook that is slow, my router, or something on the JetBot itself; there just seems to be a significant delay (~1 second) in the camera video stream and in the control responsiveness of the JetBot.

Consequently, it is unable to avoid obstacles or track an object reliably. For obstacle avoidance, it often bumps into an obstacle first, then reacts 1-2 seconds later and starts turning left. For object tracking, the gain had to be tuned much lower to avoid overshoot.

Therefore, I was wondering if there is anything I can try to improve the performance of the JetBot / Jetson Nano. Any advice or suggestions are appreciated.

kevindgoff commented 4 years ago

The latency issues improved for me once I switched to a Wi-Fi module instead of a USB version. There is still a slight delay, but the bot reacts better now.

stevenlee090 commented 4 years ago

@kevindgoff thanks for the feedback. I actually have the Intel Wi-Fi module recommended in the JetBot bill of materials. Someone else suggested the SD card could be my processing bottleneck, but I haven't had a chance to try a faster card yet.

Have you tried different SD cards with your Jetson Nano by any chance?

stevenlee090 commented 4 years ago

Upgraded my SD card to a UHS-I class one, but the demo programs still feel kind of sluggish. Not sure what I am missing.

dmccreary commented 4 years ago

I am having the same issues. Perhaps Jupyter and Python are just too slow for real-time robotics. I am also evaluating the DonkeyCar, and it seems much, much faster!

jaybdub commented 4 years ago

Hi All,

Thanks for reaching out and sharing your experience!

I have not personally witnessed this in my testing; it could be a regression in our latest v0.4.0 JetBot image. When developing JetBot, we noticed that even when network latency was bad (video not streaming to the Image widget), the robot was still able to respond in real time. Certainly not with 1-2 seconds of delay, so I suspect there is a different issue.

Do you mind sharing the following information?

Hope I can help get past this!

Best, John

stevenlee090 commented 4 years ago

@jaybdub Thanks for getting back to this issue.

Below is the information for my use case:

So in my case there is some hardware variation, in that the motor and motor driver are not the official Adafruit versions, but I doubt this is the root cause (not ruling it out at the moment, though).

Thanks for mentioning flashing an earlier version; that is indeed worth a shot. I will try re-flashing to the earlier version once I finish my current development on the JetPack image.

stevenlee090 commented 4 years ago

@dmccreary I still feel the Jetson Nano is capable enough for real-time robotics. It just depends on the machine learning models you are trying to deploy and how well your program is optimized. For instance, the Nano can reach 10-20 fps with YOLOv3-tiny, which is quite impressive for such a small form factor.

Had a quick look at the donkey car, it does seem pretty cool!

StefanW0815 commented 4 years ago

I had the same problem: the JetBot was simply not able to follow any kind of road, even under the best optical conditions, like a bright "street" on a dark floor. I mixed in some code from the data collection notebook to display the target direction marker, and you could see that it wanted to go to the center of the street, but it constantly overshot because of lag in the system. There was also several seconds of lag between the scene in front of the JetBot and the update on the PC.

With a monitor connected, the JetBot's task manager showed both active CPUs running close to 100%; the JetBot simply doesn't seem to be able to handle the load. I set the camera to 10 Hz, and the CPU load dropped a bit and at least the lag was reduced. The final solution, though, was getting a fan and bumping the power mode up to max. With all 4 CPUs running, the load dropped to ~50%. Even then, the road following example only ran smoothly with the camera set to 10 fps on top of max power.

Just to add: my laptop is a brand new Alienware, which should not be a bottleneck. I run Ethernet via cable and created a mobile hotspot so the JetBot communicates directly with the laptop; this way there is no interference from other Wi-Fi traffic.

xiepan610 commented 4 years ago


@StefanW0815 Hi! Could you tell me how to set the camera to 10 fps?

StefanW0815 commented 4 years ago

Replace

camera = Camera()

with

camera = Camera.instance(width=224, height=224, fps=10)

I replaced the Camera instantiation in data_collection and in live_demo to get a reasonable update rate.
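For context, here is a minimal sketch of how that lower frame rate fits into the standard camera cell of the notebooks (assuming the stock jetbot package from the SD card image; the widget wiring follows the usual notebook pattern):

```python
import traitlets
import ipywidgets.widgets as widgets
from IPython.display import display
from jetbot import Camera, bgr8_to_jpeg

# 10 fps instead of the default keeps CPU load and streaming lag down
camera = Camera.instance(width=224, height=224, fps=10)

# notebook widget that shows the live stream
image = widgets.Image(format='jpeg', width=224, height=224)

# push each new camera frame into the widget, JPEG-encoded
camera_link = traitlets.dlink((camera, 'value'), (image, 'value'),
                              transform=bgr8_to_jpeg)
display(image)
```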

xiepan610 commented 4 years ago

@StefanW0815 Thank you very much! I've also received your e-mail.

tomMEM commented 4 years ago

One possibility would be to try the different power modes of the Jetson, as previously mentioned:

sudo /usr/sbin/nvpmodel -q (shows the current power mode)
sudo jetson_clocks --show (shows the current clock settings in more detail)

The output should be:

GPU MinFreq=76800000 MaxFreq=921600000 CurrentFreq=921600000
CPU0-3 active: MaxFreq=1479000 CurrentFreq=1479000

If not, then try sudo /usr/sbin/nvpmodel -m 0 (or -m 1), or sudo jetson_clocks.

tomMEM commented 4 years ago

Update to my previous message: running sudo jetson_clocks to get MaxFreq is only possible if the Nano is connected to a 4 A input from a suitable power bank (e.g. Oppo).

However, whereas having MaxFreq helped in the collision_avoidance example, the time lag in the object_following example remained. The delay (1-3 s) comes from the line "detections = model(image)" in the object_following live_demo script (latest JetBot SD image, May 2020, B01 Nano). Thus, the object recognition seems to have lost speed in a newer version. Would be great to get additional suggestions. Best, T
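One way to narrow this down is to time the detection call by itself, separately from capture and display. A rough sketch, assuming the setup from the object_following live_demo (detector loaded from the SSD engine file used by that example):

```python
import time
from jetbot import ObjectDetector, Camera

model = ObjectDetector('ssd_mobilenet_v2_coco.engine')
camera = Camera.instance(width=300, height=300, fps=10)

# the first call includes TensorRT warm-up, so time several iterations
for i in range(10):
    image = camera.value
    t0 = time.perf_counter()
    detections = model(image)  # the suspect line
    print('iteration %d: model(image) took %.3f s'
          % (i, time.perf_counter() - t0))
```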

abuelgasimsaadeldin commented 4 years ago

Hi StefanW0815,

you mentioned that you mixed in some code from the data collection notebook to display the target direction marker while running the live demo. May I ask exactly what code that is? I want to know whether my trained model is the reason my road following is not achieving the expected results, or whether it is something else; I am really new to coding in general. Thanks in advance.

tomMEM commented 4 years ago

Hello, you could add the code of "def displayxy()" from the data_collection notebook of the road following project to the camera cell.
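For reference, a hypothetical sketch of such an overlay, assuming model and preprocess are the ones already defined in the road following live_demo (the marker drawing mirrors the data collection display code):

```python
import cv2
import numpy as np
from jetbot import bgr8_to_jpeg

def display_prediction(image):
    # the model outputs (x, y) in [-1, 1] relative to the image center
    xy = model(preprocess(image)).detach().float().cpu().numpy().flatten()
    px = int(224 * (xy[0] / 2.0 + 0.5))
    py = int(224 * (xy[1] / 2.0 + 0.5))
    image = np.copy(image)
    cv2.circle(image, (px, py), 8, (0, 255, 0), 3)  # green target marker
    return bgr8_to_jpeg(image)

# use it as the transform in the camera-to-widget link, e.g.:
# traitlets.dlink((camera, 'value'), (image_widget, 'value'),
#                 transform=display_prediction)
```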

I added this to live_demo_roadfollowing_targetdisp.ipynb at https://github.com/tomMEM/Jetbot-Project, and also added joystick movement and coordinate capture to data_collection_joystick_roadfollowing.ipynb.

Using more images for training, avoiding the "horizon", and using the joystick coordinates for training improved road following.

For more complex road following systems, you might have a look at the "Path tracking" section of https://atsushisakai.github.io/PythonRobotics/ and the corresponding docs at https://pythonrobotics.readthedocs.io/en/latest/

abuelgasimsaadeldin commented 4 years ago

Hi tomMEM,

thanks for the reply. I managed to make my road following example work by adjusting the speed of the motors; the reason for not achieving the expected results turned out to be the motors not operating at the same speed. Still, it would be nice to see the target direction marker while running the live demo.

Now I am trying to combine this road following with collision avoidance. I am not sure if there is code available online for that already, but if so, it would be very helpful indeed.

tomMEM commented 4 years ago

Hello Abuel, I tried to implement all three tasks (collision avoidance, object following, and road following) in one notebook. Besides a performance problem, the main problem is the different camera angles required by the tasks (collision and object use the same angle; road following needs a different one). It indeed seems like a good project for two cameras; otherwise, half of the camera frame needs to be blanked out (in code) for the road following training and live demo. Did you find a solution? Best.
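For what it's worth, a rough single-loop sketch of how the two models could run together; the model architectures, file names, and preprocessing follow the standard collision_avoidance and road_following notebooks, while the blocked threshold and the speed/steering gains are placeholder values to tune:

```python
import PIL.Image
import torch
import torch.nn.functional as F
import torchvision
import torchvision.transforms as transforms
from jetbot import Robot, Camera

device = torch.device('cuda')

# collision model: alexnet with a 2-class head (blocked / free)
collision_model = torchvision.models.alexnet(pretrained=False)
collision_model.classifier[6] = torch.nn.Linear(
    collision_model.classifier[6].in_features, 2)
collision_model.load_state_dict(torch.load('best_model.pth'))
collision_model = collision_model.to(device).eval()

# road model: resnet18 regressing the (x, y) target point
road_model = torchvision.models.resnet18(pretrained=False)
road_model.fc = torch.nn.Linear(512, 2)
road_model.load_state_dict(torch.load('best_steering_model_xy.pth'))
road_model = road_model.to(device).eval()

mean = torch.Tensor([0.485, 0.456, 0.406]).to(device)
std = torch.Tensor([0.229, 0.224, 0.225]).to(device)

def preprocess(image):
    x = transforms.functional.to_tensor(PIL.Image.fromarray(image)).to(device)
    x.sub_(mean[:, None, None]).div_(std[:, None, None])
    return x[None, ...]

robot = Robot()
camera = Camera.instance(width=224, height=224, fps=10)

def update(change):
    x = preprocess(change['new'])

    # 1) collision check: turn in place while 'blocked' is likely
    prob_blocked = float(F.softmax(collision_model(x), dim=1).flatten()[0])
    if prob_blocked > 0.8:
        robot.left(0.3)
        return

    # 2) otherwise steer toward the road model's predicted target
    xy = road_model(x).detach().float().cpu().numpy().flatten()
    steering = 0.2 * xy[0]  # proportional gain on the horizontal offset
    robot.set_motors(0.2 + steering, 0.2 - steering)

camera.observe(update, names='value')
```

Note this keeps the camera at the road following angle, so the collision model would need to be retrained on data collected at that angle (or the frame masked), which is exactly the camera-angle conflict described above.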

calleliljedahl commented 3 years ago


@kevindgoff thanks - will try