yoo0 opened this issue 6 years ago
Hi, I really appreciate your neat code and it works perfectly. However, the age estimation accuracy on the validation set is quite low and seems to reach only 0.04. I guess this is due to the number of classes, but would the model predict better if I reduced the class number? In addition, I re-checked your accuracy curve and observed that it might overfit after epoch 25. Thanks!
Yes, the accuracy is very low because it is computed as classification accuracy. However, at prediction time the estimated age is calculated as an expected value over the class probabilities instead of taking the age with the maximum probability, and mean squared error (MSE) or mean absolute error (MAE) is used as the performance indicator. Reducing the number of classes would increase the apparent accuracy, but MSE or MAE would likely get worse.
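For illustration, here is a minimal sketch (not taken from the repository) of the expected-value prediction described above. It assumes a 101-class output where class k corresponds to age k; the numbers are made up for the example.

import numpy as np

def expected_age(probs):
    # probs: (batch, n_classes) softmax output; returns the expected age per sample.
    ages = np.arange(probs.shape[1])   # class k is interpreted as age k
    return probs.dot(ages)             # E[age] = sum_k k * p_k

# A distribution spread around age 30 still predicts ~30, even though
# classification accuracy would count anything but the single true class as wrong.
probs = np.zeros((1, 101))
probs[0, 28:33] = [0.1, 0.2, 0.4, 0.2, 0.1]
pred = expected_age(probs)
print(pred)                      # ~30.0
print(np.abs(pred - 30).mean())  # MAE against a true age of 30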
Overfitting may be reduced by augmentation. Please check the "Results with data augmentation" section.
Thanks for your reply! I ran into some difficulties: when I re-ran the code, I found that the pred_acc of age is 0. It is weird, so I regenerated the dataset (IMDB), but the result remains the same. I'm confused now, so I'm posting the log screenshot below. Thanks a lot!
It is difficult to guess what's going on from the above information. The pred_age seems to always be the same value... What is the exact procedure to reproduce the problem?
I scaled the input to 0-1 and this solved the problem. What's more, the result seems better than the first time, when I used the original 0-255 images as input. I'm not sure whether you have encountered this situation before, but it is worth a try. Anyway, thank you very much for your code; it is written clearly. Thank you for your help! Much appreciated.
Thank you for the information! There seems to be room for improvement in the normalization of input images...
Hi! I ran into the same problem: the pred_acc of age was 0 all the time. Scaling the input to 0-1 did fix the zero problem, but after training with data augmentation for 30 epochs I only achieved a val_loss of 4.64. I was wondering if you have any suggestions. By the way, without data augmentation I achieved a val_loss of 3.97 after training for 30 epochs, so I really don't know why this happened. Thanks!
Thank you for your report. Would you let me know the procedure to reproduce the above experiments? In particular, I would like to know the database, training image size, and model parameters.
In terms of augmentation, random erasing might cause a problem because it assumes an input range of [0, 255].
Changing the following line in train.py
preprocessing_function=get_random_eraser(v_l=0, v_h=255))
to
preprocessing_function=get_random_eraser(v_l=0, v_h=1))
may fix the problem.
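More generally, the erasing range just has to match the scale of the images fed to the generator. Here is a hedged sketch of that point; it assumes the get_random_eraser helper is importable from random_eraser.py as in the repository's train.py, and the other augmentation parameters are illustrative only.

from keras.preprocessing.image import ImageDataGenerator
from random_eraser import get_random_eraser

# Images are rescaled to [0, 1] before training, so erased patches are filled
# with values from the same range; keep v_h=255 if the inputs stay in [0, 255].
datagen = ImageDataGenerator(
    width_shift_range=0.1,
    height_shift_range=0.1,
    horizontal_flip=True,
    preprocessing_function=get_random_eraser(v_l=0, v_h=1))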
Thanks for your reply! I trained on the IMDB-WIKI dataset with image size 64, and I simply followed the default settings you presented (width 8, depth 16, batch size 32). I retrained the model with the same settings, and this time the val_loss decreased to 3.88. I believe modifying the get_random_eraser call will help, and I will try it later. However, there is a new problem: when I evaluated the model (val_loss 3.97) on the APPA-REAL dataset, it only achieved about 15 MAE for both apparent and real age estimation, which is far worse than the results you presented. Is this due to the normalization?
Yes, the evaluation script for the APPA-REAL dataset assumes inputs ranging from 0 to 255.
https://github.com/yu4u/age-gender-estimation/blob/master/evaluate_appa_real.py#L54
If you used normalized images for training, you should do the same thing at test time, like this (I have not tested the code):
faces[i % batch_size] = cv2.resize(cv2.imread(str(image_path), 1), (img_size, img_size)).astype(np.float32) / 255
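One way (just a suggestion, not something in the repository) to avoid this kind of train/test mismatch is to put the scaling into a single helper that both train.py and evaluate_appa_real.py call, so the input range can only change in one place:

import cv2
import numpy as np

def load_face(image_path, img_size):
    # Read a BGR image, resize it to the model input size, and scale it to [0, 1].
    img = cv2.imread(str(image_path), 1)
    return cv2.resize(img, (img_size, img_size)).astype(np.float32) / 255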