carla-simulator / carla

Open-source simulator for autonomous driving research.
http://carla.org
MIT License

Server performance with multiple camera sensors #3836

Open fabioreway opened 3 years ago

fabioreway commented 3 years ago

CARLA Version: 0.9.10 OS: Linux Ubuntu 18.04

Hello, I am interested in using CARLA to create multiple camera sensors and render all their perspectives at a high resolution, for instance 1920x1232 pixels. For this reason, I have run some performance tests to assess how the simulator's performance scales with the number of cameras. The client used to connect to the server and create the camera sensors was written in C++.
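For reference, the setup corresponds roughly to the following sketch using the Python API (the actual client was written in C++; `handle_image` is just a placeholder for whatever consumes the stream):

```python
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

# RGB camera blueprint at the target resolution
camera_bp = world.get_blueprint_library().find("sensor.camera.rgb")
camera_bp.set_attribute("image_size_x", "1920")
camera_bp.set_attribute("image_size_y", "1232")

cameras = []
for i in range(4):
    # Four static cameras facing different directions (placement is illustrative)
    transform = carla.Transform(carla.Location(z=2.5), carla.Rotation(yaw=90.0 * i))
    cam = world.spawn_actor(camera_bp, transform)
    # Stream each frame to a callback instead of writing it to disk
    cam.listen(lambda image, idx=i: handle_image(idx, image))
    cameras.append(cam)
```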

The tests were performed on an i9-9900K CPU @ 3.60 GHz, 32 GB RAM and an RTX 2080 Ti. The simulation was running in epic mode, since graphic quality is important.

At first, the server ran at 60 FPS with one client and no sensors. It immediately dropped to 30 FPS with the creation of one camera sensor. With two camera sensors, it dropped to 20 FPS, and with three camera sensors to approximately 15 FPS. When the desired number of 4 camera sensors was included in the simulation, the frame rate dropped to around 13 FPS.

I also experimented with reducing the sensor resolution while increasing the number of camera sensors, so that the total number of generated pixels covered the same area as with 4 cameras, e.g. 4x1920x1232 = 16x960x616. So, 16 cameras were created at a resolution of 960x616, and the server dropped to 7 FPS; with 64 cameras at 480x308, it dropped to 2 FPS. The FPS measurements were done with a static ego vehicle and no other actors in the scenario (Town03). The results can be seen in the graph below:

(Figure 01_fps: server FPS vs. number and resolution of cameras)

We observed that the GPU memory usage (%) tends to increase as the number of high-resolution cameras increases. However, drastically increasing the number of camera sensors does not result in higher GPU resource consumption, despite those scenarios achieving the worst FPS. In fact, the highest number of cameras (64) at the lowest resolution consumed the least GPU memory while demanding about 70% of GPU usage. Meanwhile, the four cameras at high resolution consumed only 25% of GPU memory, but demanded about 90% of GPU usage. These results can be seen in the graphs below:

(Figure 06-gpu-graph-mem: GPU memory and GPU usage vs. camera configuration)

The average CPU usage (%) remained basically constant, regardless of the resolution or the number of cameras. The RAM usage (%) varied slightly, reaching its maximum with the four cameras at high resolution.

(Figure 07-cpu-mem: CPU and RAM usage vs. camera configuration)

With these results presented, I would like to discuss a few things:

  1. Does this limitation stem strictly from the Unreal Engine implementation?
  2. What could be done to achieve a higher FPS with high-resolution cameras (at least with 4)?

I would appreciate your feedback.

Thank you, Fabio Reway and Maikol Drechsler

qhaas commented 3 years ago

Another thing you might have to look out for is IO bottlenecks when saving a lot of data to disk. We tried running several captures on a local non-SSD (i.e. magnetic) drive and on a (remote) NFS share and ran into bottlenecks.

fabioreway commented 3 years ago

> Another thing you might have to look out for is IO bottlenecks when saving a lot of data to disk. We tried running several captures on a local non-SSD (i.e. magnetic) drive and on a (remote) NFS share and ran into bottlenecks.

I'm not saving the image data to the disk, but streaming it.
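For illustration, this is the difference between the two paths inside a camera callback (a minimal sketch using the Python API; `send_downstream` is a hypothetical sink standing in for the actual streaming code):

```python
import numpy as np

def on_image(image):
    # Streaming path: keep the frame in memory and forward it
    frame = np.frombuffer(image.raw_data, dtype=np.uint8)
    frame = frame.reshape((image.height, image.width, 4))  # BGRA
    send_downstream(frame)  # hypothetical network/streaming sink

    # Disk path (the one the comment above warns about):
    # image.save_to_disk("out/%06d.png" % image.frame)
```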

JimmysCheapJeep commented 3 years ago

Hello!

We noticed a similar behaviour when using multiple RGB cameras (3). The server frame rate drops to 5 FPS; every camera "costs" 15 frames/sec. :(

When additional traffic actors are spawned, the situation gets even worse. We are not saving data to disk; the frames are grabbed via the C++ API and streamed over the network. The simulation is also running in asynchronous mode.

We use an RTX 3070, which is utilized at about 25% (edit: same on an RTX 2060, usage is a bit higher, but still largely unused). The 10-core i9 CPU is utilized at about 75% (edit: it's more like 50%).

CARLA leaves much of the available resources unused...

Any ideas?

Regards, Chris

corkyw10 commented 3 years ago

@bernatx could you have a look at this please?

Karthikeyanc2 commented 3 years ago

Hello!

I am also experiencing a similar problem with LiDAR sensors.

I am using an RTX 5000 GPU. Carla Version: 0.9.10

Scenario and settings: no vehicles are spawned and the LiDAR sensors are attached to the infrastructure (all sensors are added through a single client). Synchronous mode with fixed delta seconds of 0.1.
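For reference, those settings correspond to something like this minimal sketch (Python API shown for illustration):

```python
import carla

client = carla.Client("localhost", 2000)
client.set_timeout(10.0)
world = client.get_world()

settings = world.get_settings()
settings.synchronous_mode = True     # server waits for the client to tick
settings.fixed_delta_seconds = 0.1   # fixed simulation step of 100 ms
world.apply_settings(settings)

while True:
    world.tick()  # advance the simulation by one fixed step
```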

Approximate results:

| LiDARs | Server FPS (approx.) | GPU utilization (approx.) |
|--------|----------------------|---------------------------|
| 0      | 80 fps               | 35%                       |
| 1      | 55 fps               | 29%                       |
| 2      | 35 fps               | 20%                       |
| 3      | 21 fps               | 15%                       |
| 4      | 15 fps               | 12%                       |

Carla is not utilizing the resources completely.

Any ideas?

Regards, Karthik

JimmysCheapJeep commented 3 years ago

Hi! Is there a way we can support you in analyzing the problem? We would like to use the simulation as a "real-time" application and need to achieve a higher frame rate... What frame rates can be achieved?

JimmysCheapJeep commented 3 years ago

We found that performance when running CARLA on Ubuntu 20.04 is better, but the main problem still exists.

stale[bot] commented 3 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

brscholz commented 2 years ago

Hi, we're experiencing similar issues and found out using Wireshark that, at high data bandwidth, the Boost TCP communication is the bottleneck, especially when running in sync mode. The larger the data, the more packets need to be sent; we think that shared-memory communication could help tackle this issue.

PS: Our use case involves a modified RGB camera sending 4K float images to be displayed on a Dolby Vision TV, so we're really sending tons of packets.
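For a rough sense of scale (assuming a 3840x2160 frame with four float32 channels, which is only a guess at the modified sensor's actual format): 3840 x 2160 x 16 bytes is roughly 133 MB per frame, i.e. around 2.7 GB/s at 20 FPS, which is a huge amount of data to push through a TCP stream in sync mode.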

stale[bot] commented 2 years ago

This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.

JosenJiang commented 1 year ago

Hi, we are also experiencing a similar problem. The following results were obtained with carla/PythonAPI/util/performance_benchmark.py on a machine with Ubuntu 18.04, an 11th-gen i9, 64 GB RAM and an RTX 3080 Ti. (benchmark results screenshot)

dongkeyan commented 1 year ago

Hi, we are also experiencing a similar problem. Is there any updated information?

JosenJiang commented 1 year ago

@dongkeyan CARLA 0.9.14 has been published. It adds a multi-GPU feature, which is very helpful for rendering performance.

kristjard commented 1 year ago

@JosenJiang would you be willing to post some statistics on multi-GPU usage? I am planning to purchase some extra GPUs for rendering multiple RGB sensors. What sort of FPS are you getting with different sensor setups (2, 3, 4, 5, 6 cameras)? Thank you.

brscholz commented 1 year ago

@JosenJiang That would be very interesting for me, too!

zzj403 commented 11 months ago

Same question here. I wonder whether CARLA can use the GPU at up to 100% utilization to increase performance?