lsa-pucrs / acerta-abide

Deep learning using the ABIDE data
GNU General Public License v2.0

Question about cross-validation #13

Open beautifulnight opened 4 years ago

beautifulnight commented 4 years ago

Sorry to bother you! I am wondering: did you train an MLP for each fold? If there were 10 MLP models for the 10 folds, how could the models fit well on the entire dataset? I am really confused about this. Looking forward to your reply, thank you!

anibalsolon commented 4 years ago

Hi,

for each fold, we trained the MLP on [dataset - fold] and computed the accuracy on [fold]. We reported the average accuracy over the 10 folds. I hope this makes things clearer.
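
In other words, each fold fits a fresh model on its training split and is scored only on its held-out split, and the per-fold accuracies are averaged at the end. Here is a minimal sketch of that scheme; scikit-learn's `StratifiedKFold` and `MLPClassifier` and the synthetic data are stand-ins for illustration, not the repository's actual model or features:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import StratifiedKFold
from sklearn.neural_network import MLPClassifier

# Toy stand-in data; the real pipeline uses ABIDE functional-connectivity features.
X, y = make_classification(n_samples=500, n_features=100, random_state=0)

accuracies = []
for train_idx, test_idx in StratifiedKFold(n_splits=10, shuffle=True, random_state=0).split(X, y):
    # A fresh model per fold: nothing is carried over from previous folds.
    model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500, random_state=0)
    model.fit(X[train_idx], y[train_idx])                      # train on [dataset - fold]
    accuracies.append(model.score(X[test_idx], y[test_idx]))   # accuracy on [fold]

print("mean 10-fold accuracy:", np.mean(accuracies))
```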

beautifulnight commented 4 years ago

Thanks for your quick reply. I'm a newbie to machine learning, so maybe I am wrong. I read the code and found that you pre-trained two AEs in each individual fold and fine-tuned the MLP for each fold. It seems you trained 10 independent MLPs and reported the mean accuracy of those 10 MLPs as the average accuracy. My question is: how can I pre-train the AEs and fine-tune the model using cross-validation? I mean, shouldn't the parameters of the model be updated continuously across the folds? Thanks again for your time.
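
For reference, the per-fold workflow described above (pre-train two autoencoders on the training split only, use their encoders to initialize an MLP, fine-tune, then score the held-out split) can be sketched as below. This is an illustrative PyTorch sketch, not the repository's TensorFlow code; layer sizes, epochs, and the toy data are placeholder assumptions. The key point is that every fold starts from freshly initialized weights and no parameters are carried between folds:

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.model_selection import StratifiedKFold

def pretrain_autoencoder(x, in_dim, hid_dim, epochs=20, noise=0.2):
    """Pre-train one denoising autoencoder on x and return its encoder layer."""
    enc, dec = nn.Linear(in_dim, hid_dim), nn.Linear(hid_dim, in_dim)
    opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
    for _ in range(epochs):
        noisy = x + noise * torch.randn_like(x)          # denoising corruption
        recon = dec(torch.tanh(enc(noisy)))
        loss = nn.functional.mse_loss(recon, x)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return enc

# Toy stand-in for the ABIDE correlation features.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 100)).astype("float32")
y = rng.integers(0, 2, 200)

accuracies = []
for train_idx, test_idx in StratifiedKFold(n_splits=10, shuffle=True, random_state=0).split(X, y):
    x_tr, y_tr = torch.from_numpy(X[train_idx]), torch.from_numpy(y[train_idx])
    x_te, y_te = torch.from_numpy(X[test_idx]), torch.from_numpy(y[test_idx])

    # 1) Pre-train two stacked autoencoders on the training split only.
    enc1 = pretrain_autoencoder(x_tr, 100, 64)
    enc2 = pretrain_autoencoder(torch.tanh(enc1(x_tr)).detach(), 64, 32)

    # 2) Build the MLP from the pre-trained encoders and fine-tune end to end.
    mlp = nn.Sequential(enc1, nn.Tanh(), enc2, nn.Tanh(), nn.Linear(32, 2))
    opt = torch.optim.Adam(mlp.parameters(), lr=1e-3)
    for _ in range(50):
        loss = nn.functional.cross_entropy(mlp(x_tr), y_tr)
        opt.zero_grad()
        loss.backward()
        opt.step()

    # 3) Score this fold's held-out split.
    with torch.no_grad():
        accuracies.append((mlp(x_te).argmax(1) == y_te).float().mean().item())

print("mean 10-fold accuracy:", float(np.mean(accuracies)))
```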