autowarefoundation / autoware-projects

This repository keeps track of non-source-code tasks for AWF projects

AWSIM feedback #67

Open · WJaworskiRobotec opened this issue 1 year ago

WJaworskiRobotec commented 1 year ago

An issue to post results, problems, and suggestions for improvements or fixes of AWSIM

xmfcx commented 1 year ago

@TakaHoribe has tested it and made a post here:

I ran 3 tests.

System & test setup

CPU: 5900X (12c24t)
GPU: RTX 3090 (24 GB VRAM)
Memory: 64 GB

I've used the commits:

repositories:
  core/autoware.core:
    type: git
    url: https://github.com/autowarefoundation/autoware.core.git
    version: 99891401473b5740e640f5a0cc0412c0984b8e0b
  core/autoware_adapi_msgs:
    type: git
    url: https://github.com/autowarefoundation/autoware_adapi_msgs.git
    version: fcdd3d07814b91090cd60079bd378396da6924de
  core/autoware_common:
    type: git
    url: https://github.com/autowarefoundation/autoware_common.git
    version: 58039c8b598d7e247994015d9c10c5b39e36b545
  core/autoware_msgs:
    type: git
    url: https://github.com/autowarefoundation/autoware_msgs.git
    version: 62f341d97515c81236a9a6700f02ca4e8df02304
  core/external/autoware_auto_msgs:
    type: git
    url: https://github.com/tier4/autoware_auto_msgs.git
    version: e1795354161ed54d66659366262578863bd2b862
  launcher/autoware_launch:
    type: git
    url: https://github.com/autowarefoundation/autoware_launch.git
    version: f5cd323f75dd8e5c9a210098970aa082b1be1a80
  middleware/external/heaphook:
    type: git
    url: https://github.com/tier4/heaphook.git
    version: ec260d7a7f06f29c04de2c846fa6d2158e770877
  param/autoware_individual_params:
    type: git
    url: https://github.com/autowarefoundation/autoware_individual_params.git
    version: 9b8b4064bf856dbd3f781a5ebd1f96135496878c
  sensor_component/external/nebula:
    type: git
    url: https://github.com/tier4/nebula.git
    version: 189d7e273aa66a528becac30024db5c96fa645b0
  sensor_component/external/sensor_component_description:
    type: git
    url: https://github.com/tier4/sensor_component_description.git
    version: a6a628186b124bc1af73c4f48a6dcd4b441391d3
  sensor_component/external/tamagawa_imu_driver:
    type: git
    url: https://github.com/tier4/tamagawa_imu_driver.git
    version: de4bf6be79aa2968cf2f62e0ebe1ff8a5797e6ad
  sensor_component/external/transport_drivers:
    type: git
    url: https://github.com/MapIV/transport_drivers.git
    version: d1e413482ca93c7618b005459a6a42e78e895442
  sensor_kit/external/awsim_sensor_kit_launch:
    type: git
    url: https://github.com/RobotecAI/awsim_sensor_kit_launch.git
    version: a1f5993407ffeb4abcf97a49cd1b4034768d97b4
  sensor_kit/sample_sensor_kit_launch:
    type: git
    url: https://github.com/autowarefoundation/sample_sensor_kit_launch.git
    version: 3cef1e4888a41001e4ebb3afbf63ae823e42e27d
  universe/autoware.universe:
    type: git
    url: https://github.com/autowarefoundation/autoware.universe.git
    version: e39c9282e259bab7f2342a736019ce07a9076377
  universe/external/eagleye:
    type: git
    url: https://github.com/MapIV/eagleye.git
    version: 82a4d060d622a426700f1bc83a3f265be5fa895a
  universe/external/llh_converter:
    type: git
    url: https://github.com/MapIV/llh_converter.git
    version: 07ad112b4f6b83eccd3a5f777bbe40ff01c67382
  universe/external/morai_msgs:
    type: git
    url: https://github.com/MORAI-Autonomous/MORAI-ROS2_morai_msgs.git
    version: 04f0a0b6a069fef62e0236189ce23d60abfe97f7
  universe/external/muSSP:
    type: git
    url: https://github.com/tier4/muSSP.git
    version: c79e98fd5e658f4f90c06d93472faa977bc873b9
  universe/external/ndt_omp:
    type: git
    url: https://github.com/tier4/ndt_omp.git
    version: 9a0877ac99cf873d9e984e4f1c485e537503fb7f
  universe/external/pointcloud_to_laserscan:
    type: git
    url: https://github.com/tier4/pointcloud_to_laserscan.git
    version: 948a4fca35dcb03c6c8fbfa610a686f7c919fe0b
  universe/external/rtklib_ros_bridge:
    type: git
    url: https://github.com/MapIV/rtklib_ros_bridge.git
    version: ef094407bba4f475a8032972e0c60cbb939b51b8
  universe/external/tier4_ad_api_adaptor:
    type: git
    url: https://github.com/tier4/tier4_ad_api_adaptor.git
    version: a58468edcdaf8698153c85e93fe23807edcb8187
  universe/external/tier4_autoware_msgs:
    type: git
    url: https://github.com/tier4/tier4_autoware_msgs.git
    version: fbc4871a7756bab7a99ba5ee0bd009861f2c67f3
  vehicle/external/pacmod_interface:
    type: git
    url: https://github.com/tier4/pacmod_interface.git
    version: 664a58db456e092659e5b25954fbf525593bf10d
  vehicle/sample_vehicle_launch:
    type: git
    url: https://github.com/autowarefoundation/sample_vehicle_launch.git
    version: 627068935b12ec6d6121e6a1b885e31d564b04c5
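
To reproduce this environment, the usual workflow is to save the listing above as autoware.repos, check it out with vcs import src < autoware.repos, and build with colcon. Below is a minimal sketch (assuming the file is named autoware.repos and the repositories live under src/) to double-check that a workspace actually matches these pinned commits:

#!/usr/bin/env python3
# Minimal sketch: compare each local checkout's HEAD against the commit pinned
# in autoware.repos. The file name and the src/ layout are assumptions.
import subprocess
import yaml  # PyYAML

with open("autoware.repos") as f:
    repos = yaml.safe_load(f)["repositories"]

for path, entry in repos.items():
    head = subprocess.run(
        ["git", "-C", f"src/{path}", "rev-parse", "HEAD"],
        capture_output=True, text=True,
    ).stdout.strip()
    status = "OK" if head == entry["version"] else f"MISMATCH (HEAD={head})"
    print(f"{path}: {status}")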

Test details

Test 1

0:50 Test start
0:57 Random stop
1:23 Unable to turn
2:18 Random stop
2:41 End of test due to NPC traffic jam

Test 2

1:38 Test start
2:04 Random stop
2:14 Random stop
2:29 Attempts to change lane, stops unexpectedly
2:56 Fails to go back to the old lane
3:07 Stuck waiting in the middle of the road
3:52 Stuck in the middle of the road without anything blocking the path

Test 3

1:08 Test start
1:19 Crashes into an obviously incoming vehicle from the right and stops after it happens
1:24 Stops before the vehicle in front and keeps blocking the road
2:10 Fails to take the empty road curve
2:20 Fails to take the empty road curve
2:45 Waits too long at the walkway
3:17 Stuck in a narrow node

cc. @mehmetdogru @WJaworskiRobotec @armaganarsln @yukkysaito

Aysenayilmaz commented 1 year ago

I think the blind spots in the point cloud could be due to the positions of the lidars on the right and left. When I moved the lidars, which were somewhat embedded inside the sensor kit, slightly outward and reran the simulation, there were no blind spots.

YouTube link in case the video doesn't work: https://youtu.be/GJT10ihmvRM

https://github.com/autowarefoundation/autoware-projects/assets/108546951/f22dd865-feee-417e-9936-4ef5427fce1e

Screenshot from 2023-08-23 15-38-36

Screenshot from 2023-08-23 15-38-48

WJaworskiRobotec commented 12 months ago

@xmfcx

I've run a lot of tests using AWSIM with 1 (top) lidar and with all 3 lidars on the Lexus vehicle. I have no problem with performance, even on my laptop:

Regarding the strange behavior of Autoware observed in your comment, I have noticed that the segmentation output differs between 1 and 3 lidars (check the screenshots below). With 1 lidar there is no segmentation result on the ground; with 3 lidars, however, there are segmented points (green rectangle), surprisingly created from points from the top lidar. It looks like this happens at the intersection of the scan lines from the top and side lidars. The calibration, however, seems to be correct, and those points lie almost perfectly on the road surface.

The position of the vehicle is exactly the same, as are the sensor mounting poses; the only difference is the number of enabled lidars.
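
One way to quantify this would be to count how many points of the segmentation output end up near road level in the 1-lidar and 3-lidar runs. A rough sketch follows; the topic name and the assumption that the cloud is expressed in base_link (z roughly 0 at road level) are guesses and may need to be adapted:

# Rough sketch: count points of the ground-segmentation output that lie close
# to road level. Topic name and base_link frame (z ~ 0 at the road) are assumptions.
import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import PointCloud2
from sensor_msgs_py import point_cloud2

class GroundLeakCounter(Node):
    def __init__(self):
        super().__init__("ground_leak_counter")
        self.create_subscription(
            PointCloud2,
            "/perception/obstacle_segmentation/pointcloud",  # assumed topic name
            self.on_cloud,
            qos_profile_sensor_data,
        )

    def on_cloud(self, msg):
        total = 0
        near_ground = 0
        for p in point_cloud2.read_points(msg, field_names=("x", "y", "z"), skip_nans=True):
            total += 1
            if abs(p[2]) < 0.05:  # within 5 cm of z = 0
                near_ground += 1
        self.get_logger().info(f"{near_ground}/{total} points within 5 cm of road level")

def main():
    rclpy.init()
    rclpy.spin(GroundLeakCounter())

if __name__ == "__main__":
    main()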

If you have any questions, I can provide AWSIM binaries, more details on test cases etc.

xmfcx commented 12 months ago

@WJaworskiRobotec thanks for testing.

I've run a lot of tests using AWSIM with 1 (top) lidar and with all 3 lidars on the Lexus vehicle. I have no problem with performance, even on my laptop

Could you add Graphy to the AWSIM project (import it and move the prefab to the scene root) and also record the runs with OBS so we can see the performance too?

# obs installation
sudo add-apt-repository ppa:obsproject/obs-studio
sudo apt install obs-studio

I am not convinced about the performance: on my desktop with a 3070, AWSIM was running at ~10 FPS, as is evident from the videos. It might look like it runs fine, but I don't think you were getting 60 FPS on your laptop.

About the ground removal issue, I think that's a perception ground segmentation problem from here on. It doesn't seem to play well with multiple lidars.

Short summary of my new tests with multiple machines

Today I made some more thorough tests on a 10 Gbps connected dual-machine setup, with Autoware running on a 5900X and an RTX 3090, and AWSIM running on a 5900X and an RTX 4080.

A summary of my findings:

On the awsim-stable branch, traffic light recognition works. I made my new tests on this branch.

First I tried the default single VLP-16 lidar test. It worked fine, but perception obviously struggled with so few points. (No ground segmentation problem.)

Then I activated the side lidars and reran the tests. Performance became choppy; I think something is wrong with the ROS 2 bridge or with how the data is transmitted or timestamped, because the combined point cloud sometimes lacked some of the lidars, flashing on and off, which was very annoying to witness. (The ground segmentation problem was also present, as expected.)
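
A quick way to check whether whole lidar scans are actually missing from the concatenation (rather than just rendering slowly in RViz) is to log the stamp and point count of every combined cloud; sudden drops in the count indicate a missing lidar. A rough sketch, with the topic name being an assumption:

# Rough sketch: log stamp and point count of each concatenated cloud to spot
# frames where one or more lidars are missing. Topic name is an assumption.
import rclpy
from rclpy.node import Node
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import PointCloud2

class ConcatCloudMonitor(Node):
    def __init__(self):
        super().__init__("concat_cloud_monitor")
        self.create_subscription(
            PointCloud2,
            "/sensing/lidar/concatenated/pointcloud",  # assumed topic name
            self.on_cloud,
            qos_profile_sensor_data,
        )

    def on_cloud(self, msg):
        stamp = msg.header.stamp.sec + msg.header.stamp.nanosec * 1e-9
        self.get_logger().info(f"t={stamp:.3f}s points={msg.width * msg.height}")

def main():
    rclpy.init()
    rclpy.spin(ConcatCloudMonitor())

if __name__ == "__main__":
    main()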

Then I made the rest of my tests by turning off the side lidars and changing the top lidar into a VLS-128. It had 0.2 horizontal resolution (I think this is in degrees). It no longer had the flashing lidars issue, but the performance dropped to 20-30 FPS.

Then I cranked the resolution down to 0.4 degrees and it was running at 40 FPS, which I found acceptable, so I made the rest of my tests there. With a single lidar, ground segmentation worked fine, and the perception stack had enough coverage to work with.
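
For a back-of-the-envelope sense of the data rates involved (assuming a 10 Hz rotation rate, which is a guess):

# Rough point-rate estimate for a 128-channel lidar; the 10 Hz rotation rate
# is an assumption.
channels = 128
rotation_hz = 10
for res_deg in (0.2, 0.4):
    steps = round(360 / res_deg)              # azimuth steps per revolution
    points_per_rev = channels * steps
    print(f"{res_deg} deg: {points_per_rev:,} pts/rev, "
          f"{points_per_rev * rotation_hz:,} pts/s at {rotation_hz} Hz")
# 0.2 deg -> 230,400 pts/rev (~2.3M pts/s); 0.4 deg -> 115,200 pts/rev (~1.15M pts/s)

Halving the horizontal resolution halves the simulated point rate, which is consistent with the FPS improvement observed above.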

I will make a more comprehensive report and share my findings tomorrow.

WJaworskiRobotec commented 12 months ago

@xmfcx

Regarding the performance: it was not my main focus in today's tests; I just wanted to investigate the problems with Autoware's driving behavior. By "I had no problems with performance" I meant that Autoware was able to drive; I haven't checked the FPS yet.

It seems we are aligned on the segmentation problem, so we will now focus on the performance evaluation. Unfortunately, I'm on a business trip next week, so I cannot do it myself, but I will assign someone from our team to continue this investigation.

msz-rai commented 11 months ago

@xmfcx cc @WJaworskiRobotec

Regarding the point cloud flickering, we have found an issue in our LiDAR simulation plugin. The problem is that we synchronize the scene with our native library once per frame, while the LiDARs trigger in the simulation's FixedUpdate. FixedUpdate can be called multiple times per frame if the frame rate is low. In that case, we are raytracing against the same scene while changing only the LiDARs' positions, which may then overlap with the body of the car.
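
A generic fixed-timestep loop illustrates the effect (plain Python, not the actual AWSIM/Unity code; the 0.02 s fixed step and 10 FPS render rate are assumed values):

# Generic fixed-timestep loop, not AWSIM code: when a rendered frame takes
# longer than the fixed step, the fixed update runs several times against the
# same (stale) scene snapshot.
FIXED_DT = 0.02   # assumed physics / LiDAR trigger period (50 Hz)
FRAME_DT = 0.10   # assumed render frame time (10 FPS)

accumulator = 0.0
for frame in range(3):
    # The scene is synchronized to the raytracing library once per rendered frame.
    accumulator += FRAME_DT
    fixed_steps = 0
    while accumulator >= FIXED_DT:
        accumulator -= FIXED_DT
        fixed_steps += 1  # each step moves the lidar but raytraces the stale scene
    print(f"frame {frame}: 1 scene sync, {fixed_steps} FixedUpdate calls")

At 10 FPS each rendered frame triggers five fixed updates, so several raycast passes hit the same scene snapshot with only the lidar pose changed, which is how the lidar can end up overlapping the stale car body as described above.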

We are working on a fix to this bug.

xmfcx commented 11 months ago

I have shared 7 tests made with the multi-PC setup; feel free to comment under each test: https://github.com/orgs/autowarefoundation/discussions/3813