egemenkopuz / temporal-bevfusion

Master's thesis research on 3D object detection using LiDAR and Camera data for infrastructure and railway domains, emphasizing inference optimization and utilization of temporal information for distant and occluded objects.

missing “a9_tokenize.py” #12

Open Bin-Go2 opened 3 months ago

Bin-Go2 commented 3 months ago

Thanks for the fantastic work. While trying to study your work by re-implementing the code, I found that the command

python tools/preprocessing/a9_tokenize.py --root-path ./data/tumtraf-i-no-split --out-path ./data/tumtraf-i-no-split --loglevel INFO

doesn't work. Perhaps you forgot to include a9_tokenize.py. If possible, could you share it?

Thank you!

egemenkopuz commented 3 months ago

It should be tumtraf_tokenize.py rather than a9_tokenize.py. Sorry for the typo in the README file.
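
For reference, the corrected command (the one from above, with the filename swapped in) would be:

python tools/preprocessing/tumtraf_tokenize.py --root-path ./data/tumtraf-i-no-split --out-path ./data/tumtraf-i-no-split --loglevel INFO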

Bin-Go2 commented 3 months ago

Aha, thanks for the quick reply! I will try it now.

Bin-Go2 commented 3 months ago

Sir, sorry to bother you again. I am wondering whether there is a labels_images folder in the TUMTraf-I dataset. The dataset I downloaded only has three folders, as the repo (https://github.com/tum-traffic-dataset/tum-traffic-dataset-dev-kit) describes:

├── tum_traffic_intersection_dataset_r02_s01
│   ├── images
│   │   ├── s110_camera_basler_south1_8mm
│   │   ├── s110_camera_basler_south2_8mm
│   ├── labels_point_clouds
│   │   ├── s110_lidar_ouster_north
│   │   ├── s110_lidar_ouster_south
│   ├── point_clouds
│   │   ├── s110_lidar_ouster_south
│   │   ├── s110_lidar_ouster_north

whereas your code in tools/preprocessing/tumtraf_tokenize.py, line 93, expects it:

img_label_s1_folder_out = os.path.join(out_path, split, "labels_images", "s110_camera_basler_south1_8mm")

Could you please clarify?

egemenkopuz commented 3 months ago

It might be that they revised the dataset structure and the labels_images folder is no longer included, in which case these scripts need some refactoring... The thing is, the models in this codebase don't require labeled images (2D bounding boxes, etc.); I only included that folder to keep the same dataset structure while splitting it.

I will have a look at these scripts sometime soon and will comment here if there are any updates. In the meantime, you are free to comment out such parts, but it might be a little tricky to do so, especially in the other scripts.
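
As a rough illustration (not the actual fix; the helper name and everything other than the line-93 path below are hypothetical), a guard like this would let the script skip the image-label step when the folder is absent, instead of crashing:

import os
import shutil

def copy_image_labels(root_path, out_path, split):
    # Hypothetical helper: the models in this codebase don't use 2D image
    # labels, so skip the whole step if the dataset no longer ships them.
    img_label_s1_folder_in = os.path.join(root_path, "labels_images", "s110_camera_basler_south1_8mm")
    if not os.path.isdir(img_label_s1_folder_in):
        print("labels_images not found, skipping image-label split")
        return
    # This output path matches line 93 of tumtraf_tokenize.py.
    img_label_s1_folder_out = os.path.join(out_path, split, "labels_images", "s110_camera_basler_south1_8mm")
    os.makedirs(img_label_s1_folder_out, exist_ok=True)
    for fname in os.listdir(img_label_s1_folder_in):
        shutil.copy(os.path.join(img_label_s1_folder_in, fname), img_label_s1_folder_out)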

Bin-Go2 commented 3 months ago

Oh, got it. I just thought the labeled images contained something important. Thanks for the reminder!

Bin-Go2 commented 3 months ago

Excuse me, sir. I am on the final step of data preparation, but I ran into an error:

KeyError: 'TUMTrafIntersectionDataset is not in the dataset registry'

I am wondering if you have run into this problem before; I would appreciate your advice.

(Btw, I didn't use your Docker setup to build the environment since my server has no network access; I reused an environment I created earlier. I am not sure if that is the reason.)

egemenkopuz commented 3 months ago

You probably need to "install" the local repo using the commands defined in the Makefile. You can see there are install-pkgs and install-dev/install-prod targets; you basically run them in your Python environment.
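
For context on why installing matters: registries in the mmdet/mmdet3d stack are populated at import time. A custom dataset is typically registered with a decorator along these lines (a minimal sketch assuming this codebase follows the usual mmdet3d convention; the module path and base class here are illustrative, not the actual source):

# e.g. mmdet3d/datasets/tumtraf_dataset.py (illustrative path)
from mmdet.datasets import DATASETS
from mmdet3d.datasets import Custom3DDataset

@DATASETS.register_module()
class TUMTrafIntersectionDataset(Custom3DDataset):
    # The class only becomes visible to configs once this module is
    # imported, which happens when the package itself is installed.
    ...

If the repo isn't installed (via the Makefile targets above, or e.g. an editable pip install -e .), this module is never imported, the decorator never runs, and building the dataset from the config fails with exactly that KeyError.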

lacie-life commented 2 weeks ago

I ran into the same problem: KeyError: 'TUMTrafIntersectionDataset is not in the dataset registry', and I used a Docker build. Could you let me know if you've fixed it yet? @Bin-Go2