Closed: aoxolotl closed this issue 2 years ago
Hi,
Just to make sure, you should be working with Argoverse 1.0 (since I have not tested 2.0). Also, make sure that you follow the instructions on installing the argoverse-api.
If all of this was done, then I am a little unsure what is causing the problem, as I cannot replicate it. The poor results you are getting make sense, since no map information is being provided. Can you give me more details on your setup?
Thanks,
Thank you for the quick response! I am using Argoverse 1.1, since 1.0 is marked as deprecated (see the bottom of the page here). Does the AutoBots paper use the 1.0 dataset? Apart from this, I am using the default training arguments:
python train.py --exp-id test --seed 1 --dataset Argoverse --model-type Autobot-Ego --num-modes 10 --hidden-size 128
--num-encoder-layers 2 --num-decoder-layers 2 --dropout 0.1 --entropy-weight 40.0 --kl-weight 20.0 --use-FDEADE-aux-loss True --use-map-lanes True
--tx-hidden-size 384 --batch-size 64 --learning-rate 0.00075 --learning-rate-sched 10 20 30 40 50 --dataset-path data/argoverse/
I also use the create_h5_argo.py script provided in this repo.
Yes, 1.1 is correct. The training command is also correct, so that's not the issue. The problem is really with create_h5_argo.py, and I can't tell exactly where it's coming from for you. I just ran a fresh install on my laptop, i.e., downloaded the dataset files and HD maps, installed argoverse-api following the instructions in the GitHub repo linked in my previous message, and ran create_h5_argo.py without encountering any issues.
Can you provide more details on your setup with the Argoverse data? What is the exact command you run for create_h5_argo.py? What does the directory tree look like in argoverse-api and in your raw dataset path?
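To make that easier to share, a quick check along these lines would print what the preprocessing script actually sees. It assumes the standard Argoverse 1.1 forecasting layout of <split>/data/*.csv under the dataset root (the helper name is illustrative; adjust the splits if your layout differs):

```python
from pathlib import Path

def summarize_dataset_root(root):
    """Count sequence CSVs per split, assuming an Argoverse 1.1
    forecasting layout of <root>/<split>/data/*.csv.
    Missing split directories simply report zero files."""
    root = Path(root)
    return {
        split: len(list((root / split / "data").glob("*.csv")))
        for split in ("train", "val")
    }
```

A zero count for a split you expect to be populated usually means the tarballs were extracted to the wrong level relative to --dataset-path.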
Thanks!
Hi, I had a look at my dataset files again, and it turns out I was pointing my scripts to a modified version. Thank you for your help with this! Since there is nothing wrong with the create_h5_argo.py script, I am closing this issue.
Hi, thank you for releasing your work to the open-source community! I am trying to replicate results on the Argoverse dataset. While building the h5py files I run into an error when the script cannot find lane centerlines associated with the current query_bbox. I tried patching this by creating dummy data, but this doesn't give good results after training (Val minADE: 1.621 minFDE: 3.636). Is there a better way to handle the errors that crop up while building the dataset? Or do I need to look into changing the training setup? Thank you!
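For what it's worth, one alternative to injecting dummy map data is to drop the offending sequences entirely during preprocessing. A minimal sketch of that idea (the helper names and the sequence dict layout here are illustrative, not taken from create_h5_argo.py):

```python
def filter_sequences(sequences, get_centerlines):
    """Keep only sequences whose query region yields at least one lane
    centerline; return the kept sequences and the number skipped.
    `get_centerlines` stands in for whatever map lookup the
    preprocessing script performs for each query_bbox."""
    kept, skipped = [], 0
    for seq in sequences:
        if get_centerlines(seq["query_bbox"]):
            kept.append(seq)
        else:
            skipped += 1
    return kept, skipped
```

Dropping the handful of map-less sequences keeps the map encoder's input distribution intact during training, at the cost of slightly less data, whereas dummy centerlines feed the model fabricated geometry.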