wzzheng / OccWorld

[ECCV 2024] 3D World Model for Autonomous Driving
https://wzzheng.net/OccWorld/
Apache License 2.0

About Evaluating #11

Closed VitaLemonTea1 closed 8 months ago

VitaLemonTea1 commented 11 months ago

Hi, thanks for your help. I just finished training with 4 x 3090 GPUs. After evaluating, I see that my result is quite different from yours. I wonder if you could tell me where I can change the tokenizer settings and other hyperparameters. Thanks!
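For context, hyperparameters such as the tokenizer (VQ-VAE) settings normally live in the Python config files shipped with the repository. The sketch below is only illustrative; the field names (`codebook_size`, `embed_dim`, etc.) are assumptions and may not match the actual OccWorld config keys, so check the `config/` directory in the repo.

```python
# Hypothetical sketch of where tokenizer hyperparameters might be adjusted.
# Field names are illustrative assumptions, not the repository's actual keys.

vqvae_config = dict(
    codebook_size=512,    # number of discrete tokens in the VQ codebook (assumed)
    embed_dim=128,        # dimensionality of each codebook entry (assumed)
    commitment_beta=0.25, # weight of the commitment loss term (assumed)
)

train_config = dict(
    max_epochs=200,        # matches the 200-epoch schedule discussed in this thread
    batch_size_per_gpu=1,  # as reported by commenters below
    lr=1e-3,               # assumed starting learning rate
)
```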

liuziyang123 commented 11 months ago

Hi, I have the same question. The VQ-VAE was trained for 200 epochs on 8 V100 GPUs with a batch size of 1 per GPU. After training I got the following results: [screenshot 20231218-113927]. Then, for the second stage, the VQ-VAE checkpoint (epoch_200.pth) was loaded and OccWorld was trained for another 200 epochs on 8 V100 GPUs. After training I ran eval_metric_stp3.py and got the final results: [screenshot 20231218-113932]. However, the results seem to differ from those reported in the paper: [screenshot 20231218-114551].
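To make the two-stage setup described above concrete, here is a minimal sketch of loading the stage-1 VQ-VAE weights before stage-2 training. The `occworld_model.vqvae` attribute and the checkpoint layout are assumptions for illustration only; the repository's actual module names and entry points may differ.

```python
# Hedged sketch of the two-stage pipeline: train the VQ-VAE tokenizer first,
# then load its weights (e.g. epoch_200.pth) before training OccWorld.
import torch


def load_stage1_into_stage2(vqvae_ckpt_path: str, occworld_model: torch.nn.Module) -> None:
    """Load pretrained VQ-VAE weights into the world model and freeze them."""
    state = torch.load(vqvae_ckpt_path, map_location="cpu")
    # Checkpoints often wrap the weights under a 'state_dict' key.
    state_dict = state.get("state_dict", state)
    # 'vqvae' is an assumed attribute name for the tokenizer submodule.
    missing, unexpected = occworld_model.vqvae.load_state_dict(state_dict, strict=False)
    print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
    # Freeze the tokenizer so only the autoregressive world model is updated in stage 2.
    for p in occworld_model.vqvae.parameters():
        p.requires_grad = False
```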

dk-liang commented 11 months ago

I have the same question. Could the authors provide the training log and testing log? Thanks!

chen-wl20 commented 8 months ago

Thank you for your interest! I'm very sorry for replying so late. I have provided the training and testing logs for the GitHub repository code. The results may be slightly different due to random numbers. I have also provided the model used in the paper and its evaluation log. Additionally, I recommend selecting the best VQ-VAE model for the OccWorld training.
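Following the suggestion to select the best VQ-VAE model before stage 2, here is a small hedged sketch of picking the checkpoint with the best reconstruction metric. The metric values and paths below are placeholders; in practice you would fill them in from your own evaluation logs, since the repository's log format is not assumed here.

```python
# Placeholder sketch: choose the VQ-VAE checkpoint with the highest
# reconstruction metric (e.g. mIoU) before starting OccWorld training.

def select_best_checkpoint(metrics_by_ckpt: dict[str, float]) -> str:
    """Return the checkpoint path with the highest reconstruction metric."""
    return max(metrics_by_ckpt, key=metrics_by_ckpt.get)


if __name__ == "__main__":
    # Example values only; replace with numbers read from your eval logs.
    metrics = {
        "out/vqvae/epoch_150.pth": 61.2,
        "out/vqvae/epoch_175.pth": 62.8,
        "out/vqvae/epoch_200.pth": 62.1,
    }
    print(select_best_checkpoint(metrics))
```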