UniModal4Reasoning / ChartVLM

Official Repository of ChartX & ChartVLM: A Versatile Benchmark and Foundation Model for Complicated Chart Reasoning

How many epochs did you train for, and what was the best loss? #12

nguyenquangtan opened 2 months ago

nguyenquangtan commented 2 months ago

Hi, I am trying to replicate the training procedure on ChartQA, PlotQA, Chart2Text, and SimChart9K as described in your paper. This is my first time training such a large model, so I don't know when to stop the training loop. Could you please provide the number of epochs and the best loss you reached when training your model on these four datasets?

Thank you for your attention to this matter.

renqiux0302 commented 2 months ago

Hi @nguyenquangtan, we trained the base decoder for 50 epochs and the auxiliary decoder for 5 epochs.

nguyenquangtan commented 2 months ago

Thank you for the information. May I ask what the best losses of both models were? I saw that you use cross-entropy loss. Since our training data is different, knowing the best losses would help us estimate an upper bound and decide when it is the right time to stop the training loop.
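For reference, an alternative to matching a target loss is to stop on a validation signal. Below is a minimal early-stopping sketch in PyTorch style; it is not from the ChartVLM codebase, and `model`, `train_one_epoch`, `val_loader`, and `patience` are placeholder names I made up for illustration:

```python
import torch
import torch.nn.functional as F

def validate(model, val_loader, device="cuda"):
    """Average cross-entropy loss over the validation set."""
    model.eval()
    total_loss, num_batches = 0.0, 0
    with torch.no_grad():
        for inputs, targets in val_loader:
            logits = model(inputs.to(device))
            total_loss += F.cross_entropy(logits, targets.to(device)).item()
            num_batches += 1
    return total_loss / max(num_batches, 1)

def train_with_early_stopping(model, train_one_epoch, val_loader,
                              max_epochs=50, patience=3):
    """Stop when validation loss has not improved for `patience` epochs."""
    best_loss, epochs_without_improvement = float("inf"), 0
    for epoch in range(max_epochs):
        train_one_epoch(epoch)                    # one full pass over the training data
        val_loss = validate(model, val_loader)
        if val_loss < best_loss:
            best_loss, epochs_without_improvement = val_loss, 0
            torch.save(model.state_dict(), "best_checkpoint.pt")
        else:
            epochs_without_improvement += 1
            if epochs_without_improvement >= patience:
                print(f"Early stopping at epoch {epoch}, best val loss {best_loss:.4f}")
                break
    return best_loss
```

With held-out splits of the four datasets, this kind of criterion avoids depending on the authors' exact loss values, which may not transfer to different training data anyway.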

Road2Redemption commented 4 days ago

I also want to know the losses of both models. Have you figured it out? Thanks! I trained for several epochs but the results were bad :(