ktro2828 / perception_eval_rs

MIT License

[TODO] add support of loading config from scenario #7

Open ktro2828 opened 1 year ago

ktro2828 commented 1 year ago

What

Add support for loading PerceptionEvaluationConfig from a scenario file in YAML format. As a first step, we will support scenario files like the one below.

ScenarioFormatVersion: 3.0.0
ScenarioName: sample_perception_evaluation
ScenarioDescription: sample evaluation for perception
SensorModel: sample_sensor_kit
VehicleModel: sample_vehicle
Evaluation:
  UseCaseName: perception
  UseCaseFormatVersion: 0.4.0
  Datasets:
    - tests/sample_data:
        Version: annotation
        VehicleId: default # Specify VehicleId for each data set.
        LaunchSensing: false # Specifies whether the sensing module should be activated for each dataset. If false, use concatenated/pointcloud in the bag.
        LocalMapPath: $HOME/autoware_map/sample-map-planning # Specify LocalMapPath for each data set.
  Conditions:
    PassRate: 99.0 # Percentage (%) of evaluation attempts that must succeed for the scenario to pass.
  PerceptionEvaluationConfig:
    evaluation_config_dict:
      evaluation_task: detection
      frame_id: base_link
      target_labels: [car, bicycle, pedestrian, motorbike]
      max_x_position: 100.0
      max_y_position: 100.0
      min_point_number: 0
      target_uuids: null
      center_distance_threshold: 1.0
      plane_distance_threshold: 2.0
      iou_2d_threshold: 0.5
      iou_3d_threshold: 0.5

This is a minimal implementation of the scenario format, and it differs slightly from the original one. I am currently working on this in https://github.com/ktro2828/perception_eval_rs/tree/feat/config-from-yaml.

TODO list

ktro2828 commented 1 year ago

Finished the fundamental implementation. The following shows the result of loading PerceptionEvaluationConfig from a scenario file, as of https://github.com/ktro2828/perception_eval_rs/commit/f027815e215703a0471a5341584d92eadce17a5d

The following output can be confirmed with examples/config.rs:

Config: PerceptionEvaluationConfig {
    version: "annotation",
    dataset_path: "tests/sample_data",
    evaluation_task: Detection,
    frame_id: BaseLink,
    result_dir: "./work_dir/20230531_035900",
    log_dir: "./work_dir/20230531_035900/log",
    viz_dir: "./work_dir/20230531_035900/visualize",
    filter_params: FilterParams {
        target_labels: [Car, Bicycle, Pedestrian, Motorbike],
        max_x_positions: [100.0, 100.0, 100.0, 100.0],
        max_y_positions: [100.0, 100.0, 100.0, 100.0],
        min_point_numbers: Some([0, 0, 0, 0]),
        target_uuids: None,
    },
    metrics_params: MetricsParams {
        target_labels: [Car, Bicycle, Pedestrian, Motorbike],
        center_distance_thresholds: [1.0, 1.0, 1.0, 1.0],
        plane_distance_thresholds: [2.0, 2.0, 2.0, 2.0],
        iou2d_thresholds: [0.5, 0.5, 0.5, 0.5],
        iou3d_thresholds: [0.5, 0.5, 0.5, 0.5],
    },
    load_raw_data: false,
}
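Note how the scalar thresholds in the scenario (e.g. `center_distance_threshold: 1.0`) appear in the loaded config as one value per target label (`[1.0, 1.0, 1.0, 1.0]`). A minimal sketch of that expansion, with a hypothetical helper name:

```rust
// Sketch: expand a scalar threshold from the scenario YAML into one value
// per target label, as seen in the loaded config above.
// `broadcast_threshold` is an illustrative name, not the crate's actual API.
fn broadcast_threshold(value: f64, num_labels: usize) -> Vec<f64> {
    vec![value; num_labels]
}

fn main() {
    let target_labels = ["car", "bicycle", "pedestrian", "motorbike"];
    let center_distance_thresholds = broadcast_threshold(1.0, target_labels.len());
    println!("{center_distance_thresholds:?}"); // [1.0, 1.0, 1.0, 1.0]
}
```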