Closed — beaverSun-dotcom closed this issue 1 year ago
Thank you for your interest! This is really weird. Did you check whether the concept loaders used in line 230 of train_places.py are loading the correct concept images? Also, line 251 of MODELS/iterative_normalization.py prints the objective value of the gradient descent under the orthogonality constraint — how does that objective value change on your end during training?
Thanks for your reply!
I visualised the images in the concept loaders to confirm the concepts, and all of them are correct. Below are the images for the concepts airplane, bed, and person.
I then trained for two epochs (whitened layers = 8). The change in the objective value is confusing and unsteady: in epoch 1 the value first ascends and then descends, while in epoch 2 it first descends and then ascends. The results are listed below.
And the test results remain unchanged.
Thanks again for your generous help!
Beaver
This closed issue seems relevant (a very similar problem): https://github.com/zhiCHEN96/ConceptWhitening/issues/9. You might want to take a look. Also, have you made any changes to the code? The problem in that issue arose because that user modified the code and used the wrong dataloader.
Another important thing to check is whether you are loading the correct model weights. One simple check is to print(checkpoint['best_prec1']) after loading the checkpoint in line 1502 of plot_functions.py.
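A hedged sketch of that check, assuming the checkpoint is a plain dict containing a 'best_prec1' entry as saved by the training script (the function name is my own, not from the repo):

```python
# Verify the checkpoint actually holds the weights you trained: if the
# printed accuracy is unexpected, the wrong file is being loaded.
import torch

def check_checkpoint(path):
    """Load a checkpoint dict and report its saved best top-1 accuracy."""
    checkpoint = torch.load(path, map_location="cpu")
    print("best_prec1:", checkpoint["best_prec1"])
    return checkpoint["best_prec1"]
```

The printed value should match the accuracy reported at the end of your training run.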
If this still doesn't resolve your issue, feel free to schedule a short meeting with me by email!
Hi, thanks for your help. I re-downloaded the code and ran it with the shell scripts you provided, and it worked perfectly! The problem was probably caused by my modifications to the original code. Anyway, thanks for your patience.
Hi, I admire your excellent work and have been trying to reproduce some of the results.
I used the pretrained resnet18_places365.pth model, replaced its 8th layer with a CW layer, and trained for 2 epochs. Here is my training command:
python /media/D/beaver/ConceptWhitening-final_version/train_places.py --ngpu 1 --workers 2 --arch resnet_cw --depth 18 --epochs 2 --batch-size 32 --lr 0.05 --whitened_layers 8 --concepts airplane,bed,person --prefix RESNET18_PLACES365_CPT_WHITEN_TRANSFER /media/D/beaver/ConceptWhitening-final_version/data_256
and then I used the saved checkpoint and the plot function to plot the top 50 activated images for each concept. Here is my command:
python /media/D/beaver/ConceptWhitening-final_version/train_places.py --ngpu 1 --workers 2 --arch resnet_cw --depth 18 --epochs 2 --batch-size 32 --lr 0.05 --whitened_layers 8 --concepts airplane,bed,person --prefix RESNET18_PLACES365_CPT_WHITEN_TRANSFER --resume /media/D/beaver/ConceptWhitening-final_version/checkpoints/RESNET18_PLACES365_CPT_WHITEN_TRANSFER_8_checkpoint.pth.tar /media/D/beaver/ConceptWhitening-final_version/data_256 --evaluate plot_top50
However, although the training result is similar to the result in the paper (epoch 2: * Prec@1 51.690 Prec@5 82.630), the plotted images for each concept do not match the concept at all, which is weird.
When I use your saved checkpoint instead, the plotted results look excellent, so I suspect the plot function itself is fine. Have you ever encountered the same problem, and how did you solve it?
Thanks for your time,
Yours sincerely,
Beaver