Closed Yingjuan closed 1 year ago
Hi, I observed the same behavior! Have you figured out why? @Yingjuan
No, unfortunately, I didn't figure it out. @chenyugoal
Thanks for your reply! @Yingjuan
Thank you for your interest in our project. We have double-checked the model and code, and did not encounter the issue of "getting similar second-stage outputs for the same discretized age input." However, we have made improvements regarding the data standardization concern raised by Yingjuan last time, and updated the model's pre-training weights. We appreciate Yingjuan's contribution.

Additionally, we have provided some test samples that can be used for result verification. You can download the data from the following link and run inference on the sample data.

Test sample link: https://bhpan.buaa.edu.cn/link/AA6A7985090E3340ADA0EA3A0E86BF84F5
Dear authors, I have preprocessed my data using the provided `fsl_anat_tsan` script and attempted to predict age with your pre-trained models by running `bash script/bash_test_second_stage.sh`. To inspect the predicted age for each participant, I inserted the following print statements into `/TSAN/prediction_second_stage.py`:

```python
print("first_stage_predict:", first_stage_predict.cpu().numpy())
print("dis_age:", dis_age.cpu().numpy())
print("output_age:", output_age.cpu().numpy())
```
However, I noticed that the final predicted ages are identical for different participants whenever they share the same discretized age after the first stage, so the outputs collapse to only a few distinct values. Here is an example output for one batch containing 8 different images:
```
first_stage_predict: [[86.576164] [69.764824] [66.17979 ] [86.32469 ] [65.79527 ] [56.986835] [83.63843 ] [72.029015]]
dis_age:             [[85.] [70.] [65.] [85.] [65.] [55.] [85.] [70.]]
output_age:          [[84.99657] [70.02373] [65.03278] [84.99657] [65.03278] [55.05089] [84.99657] [70.02373]]
```
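For what it's worth, the `dis_age` values in this batch are consistent with rounding the first-stage prediction to the nearest multiple of 5 years. This is only an inference from the printed numbers, not something I have confirmed in the TSAN code; a quick check:

```python
# Values copied from the printed batch above.
first_stage_predict = [86.576164, 69.764824, 66.17979, 86.32469,
                       65.79527, 56.986835, 83.63843, 72.029015]
dis_age = [85.0, 70.0, 65.0, 85.0, 65.0, 55.0, 85.0, 70.0]

# Hypothesis: discretization = round to the nearest 5-year bin.
rounded = [round(p / 5) * 5 for p in first_stage_predict]
assert rounded == dis_age, "5-year rounding does not reproduce dis_age"
print("dis_age matches 5-year rounding of first_stage_predict")
```

If this is indeed the discretization rule, every participant whose first-stage prediction lands in the same 5-year bin enters the second stage with the same discretized input.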
I would like to understand if this behavior is expected or if there are any suggestions on how to address this issue. Thank you for your help!
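To make the observation concrete: one way to check whether the second stage depends only on the discretized age is to group `output_age` by `dis_age` and count the distinct outputs per group. If each group contains exactly one value, the second stage is (at least for this batch) a function of `dis_age` alone. A minimal sketch using the batch printed above (the grouping code is my own debugging aid, not part of TSAN):

```python
from collections import defaultdict

# Values copied from the printed batch above.
dis_age = [85.0, 70.0, 65.0, 85.0, 65.0, 55.0, 85.0, 70.0]
output_age = [84.99657, 70.02373, 65.03278, 84.99657,
              65.03278, 55.05089, 84.99657, 70.02373]

# Collect the distinct second-stage outputs seen for each discretized input.
groups = defaultdict(set)
for d, o in zip(dis_age, output_age):
    groups[d].add(o)

for d, outs in sorted(groups.items()):
    print(f"dis_age={d}: {len(outs)} distinct output(s) -> {sorted(outs)}")
```

For this batch, every `dis_age` bin maps to a single output, which is exactly the collapse described above.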