eduardolagobatista opened this issue 3 years ago
Hi @eduardolagobatista, thank you for your interest! I have updated the README file to include Google Drive links to the ego motion features I extracted from ORBSLAM2.
Thank you for the files. With them I was able to train the model. Just to confirm: from what I understood, in order to run the trained model on a custom dataset, like A3D, we have to follow these steps:
1. Generate detection files (bounding boxes) using the Mask R-CNN repository, then use the detection files to generate tracking files using the Deep_Sort repository;
2. Generate optical flow files using the FlowNet2.0 repository;
3. Generate ego_motion files using the ORBSLAM2 repository;
4. Run run_fol_for_AD.py and run_AD.py (a rough orchestration sketch follows below).
Am I missing any step? Thank you very much!
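To make sure I understood the order of the steps, here is a minimal sketch of how I would wire them together. The dataset root, the feature folder names, and the --load_config flag are my own assumptions for illustration, not the repo's official layout or CLI:

```python
# Rough orchestration sketch of steps 1-4 above (not the repo's official pipeline).
import subprocess
from pathlib import Path

DATA_ROOT = Path("/media/DATA/A3D")          # assumed root of the custom dataset
precomputed = {
    "detection":  DATA_ROOT / "detection",   # step 1: Mask R-CNN bounding boxes
    "tracking":   DATA_ROOT / "deep_sort",   # step 1: Deep_Sort tracking files
    "flow":       DATA_ROOT / "flownet2",    # step 2: FlowNet2.0 optical flow
    "ego_motion": DATA_ROOT / "ego_motion",  # step 3: ORBSLAM2 ego motion
}
missing = [name for name, path in precomputed.items() if not path.is_dir()]
if missing:
    raise FileNotFoundError(f"Precompute these features first: {missing}")

# step 4: future object localization, then anomaly detection
subprocess.run(["python", "run_fol_for_AD.py", "--load_config", "config/fol_ego_train.yaml"], check=True)
subprocess.run(["python", "run_AD.py", "--load_config", "config/fol_ego_train.yaml"], check=True)
```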
Hi @eduardolagobatista! Thanks, the procedure is quite clear. Does it work now? Like you, I am going to train on my custom dataset.
Hi @MoonBlvd, I mainly want to run the model for evaluation purposes. I am not sure how to use the pkl files you provided. The fol_ego_train.yaml file requires these parameters to be set:
data_root: "/media/DATA/HEVI_dataset/fol_data" ego_data_root: "/media/DATA/HEVI_dataset/ego_motion" checkpoint_dir: "checkpoints/fol_ego_checkpoints"
best_ego_pred_model: "checkpoints/ego_pred_checkpoints/epoch_080_loss_0.001.pt" test_dataset: "taiwan_sa" #"A3D" #"taiwan_sa" test_root: #"../data/taiwan_sa/testing" #"/media/DATA/A3D" #"/media/DATA/VAD_datasets/taiwan_sa/testing" AnAnAccident_Detection_Dataset label_file: '../data/A3D/A3D_labels.pkl'
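For context, this is how I am inspecting which paths the config expects; it is just a PyYAML sketch on my side, and the repo's own config loader may work differently:

```python
# Sketch: read fol_ego_train.yaml with PyYAML and print the path-related keys.
# The repo's actual config handling may differ; this is only for inspection.
import yaml

with open("config/fol_ego_train.yaml") as f:
    cfg = yaml.safe_load(f)

for key in ("data_root", "ego_data_root", "checkpoint_dir",
            "best_ego_pred_model", "test_dataset", "test_root", "label_file"):
    print(f"{key}: {cfg.get(key)}")
```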
Which path needs to point to which of the data folders you provided? What is the label_file parameter?
Alternatively, could you also provide the pre-trained weights so that I do not need to train at all?
Hey @eduardolagobatista, have you generated the features, from Mask R-CNN through ORBSLAM2?
Good morning.
First of all, thank you for your work. I am trying to run the training procedure, and I checked that for train_ego_pred.py the dataset is built over the pickle files you provided. However, I noticed that running train_fol.py requires the ego_motion numpy files, which are loaded inside the HEVIEgoDataset() class. What are these files? Are they outputs from the ego_pred model, or are they part of the HEVI dataset? (I can't request the dataset since I'm not from the USA.) Can we derive them from the pickle files and save them as numpy files? I sketched what I have in mind below.
Thank you very much.
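This is the conversion I am imagining, assuming each provided pickle maps a video name to a per-frame ego-motion array; the dict layout and the output folder name are only my assumptions, not something confirmed by the repo:

```python
# Hedged sketch: convert the provided ego-motion pickle files into per-video
# .npy files that a dataset class could load. All formats here are assumptions.
import pickle
from pathlib import Path

import numpy as np

src = Path("ego_motion_pkl")   # assumed folder holding the provided .pkl files
dst = Path("ego_motion_npy")   # assumed target folder for per-video .npy files
dst.mkdir(parents=True, exist_ok=True)

for pkl_file in src.glob("*.pkl"):
    with open(pkl_file, "rb") as f:
        ego = pickle.load(f)   # assumed: {video_name: array of shape (num_frames, D)}
    if isinstance(ego, dict):
        for video_name, arr in ego.items():
            np.save(dst / f"{video_name}.npy", np.asarray(arr))
    else:
        # fallback: one array per pickle, keyed by the pickle's filename
        np.save(dst / f"{pkl_file.stem}.npy", np.asarray(ego))
```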