fudan-zvg / RoadNet

[ICCV2023 Oral] RoadNetworkTRansformer & [AAAI 2024] LaneGraph2Seq
MIT License

Inference code for lanegraph2seq seems wrong #6

Closed. ZJWang9928 closed this issue 7 months ago.

ZJWang9928 commented 8 months ago

Hi! I found that the test/inference code in RoadNetwork/rntr/ar_lanegraph2seq.py is the same as the test/inference code in RoadNetwork/rntr/ar_rntr.py, which is for RNTR rather than Lanegraph2Seq. Could you please update the test/inference code for Lanegraph2Seq? Thank you so much!
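
As a quick way to confirm the overlap, the two files referenced above can be diffed directly. This snippet is only an illustration and not part of the repo:

```python
# Illustrative check (not from the repo): diff the two files mentioned in the
# issue to see whether their contents actually differ.
import difflib

with open('RoadNetwork/rntr/ar_lanegraph2seq.py') as f1, \
        open('RoadNetwork/rntr/ar_rntr.py') as f2:
    diff = difflib.unified_diff(
        f1.readlines(), f2.readlines(),
        fromfile='ar_lanegraph2seq.py', tofile='ar_rntr.py')
    print(''.join(diff) or 'no differences found')
```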

VictorLlu commented 8 months ago

I'm currently addressing the dataloader timing issue. If necessary, I can prioritize adding the inference code within the next few days.

ZJWang9928 commented 8 months ago

@VictorLlu Thank you for your update and reply. Could you also add the code for performance evaluation soon?

BTW, when training with 8 GPUs, I printed the time spent on data preprocessing (time1), forward propagation (time2), and back propagation (time3) in mmengine/model/wrappers/distributed.py. It seems that the major cause of the long average batch time is extremely slow back propagation in some iterations (timing screenshot attached). You can refer to #3 for details. I hope this helps to some extent.
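
For reference, here is a minimal sketch (not the actual patch used above) of how such per-stage timing could be inserted into MMDistributedDataParallel.train_step in mmengine/model/wrappers/distributed.py. The method structure shown (data_preprocessor, _run_forward, parse_losses, update_params) is assumed from mmengine and may differ across versions:

```python
# Rough sketch of a timing-instrumented train_step for mmengine's
# MMDistributedDataParallel; intended to be dropped into that class as a
# replacement method, not run standalone.
import time

import torch


def train_step(self, data, optim_wrapper):
    torch.cuda.synchronize()  # make the GPU timings meaningful
    t0 = time.perf_counter()

    with optim_wrapper.optim_context(self):
        data = self.module.data_preprocessor(data, training=True)
        torch.cuda.synchronize()
        t1 = time.perf_counter()  # time1: data preprocessing

        losses = self._run_forward(data, mode='loss')
        torch.cuda.synchronize()
        t2 = time.perf_counter()  # time2: forward propagation

    parsed_losses, log_vars = self.module.parse_losses(losses)
    optim_wrapper.update_params(parsed_losses)
    torch.cuda.synchronize()
    t3 = time.perf_counter()  # time3: back propagation + optimizer step

    print(f'time1={t1 - t0:.3f}s  time2={t2 - t1:.3f}s  time3={t3 - t2:.3f}s')
    return log_vars
```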

VictorLlu commented 7 months ago

Thank you for your contribution. I've implemented the inference code for AR-RNTR first. You can use test.py, or the training pipeline will automatically run an evaluation every 50 epochs.
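
For context, an "evaluate every 50 epochs" setting would normally live in the mmengine training-loop config. The fragment below is only a hypothetical illustration; the actual key names and values are defined in the repo's config files:

```python
# Hypothetical mmengine config fragment, not copied from the repo's configs.
train_cfg = dict(
    type='EpochBasedTrainLoop',
    max_epochs=150,   # placeholder; use the value from the actual config
    val_interval=50,  # run evaluation after every 50 training epochs
)
```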