Open · MLRadfys opened this issue 1 year ago
Sorry for hardcoding it in the plot function. This is just an example of plotting the top-50 images for a model whose layer '8' is the CW layer; you should change it if the whitened layer is not '8'. Note that when you change it, the whitened_layer argument passed to load_resnet_model and to plot_concept_top50 needs to be the same, because we want to visualize the CW layer, not the other layers.
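For concreteness, here is a minimal sketch of how that snippet could be kept consistent. The variable name cw_layer is introduced here purely for illustration; the point is only that both calls should receive the same layer string that was used during training:

```python
# Sketch (illustrative): keep the CW layer in a single variable so that
# load_resnet_model and plot_concept_top50 always refer to the same layer.
if args.evaluate == 'plot_top50':
    print("Plot top50 activated images")
    cw_layer = '8'  # set this to wherever the whitened layer was placed during training
    model = load_resnet_model(args, arch='resnet_cw', depth=18, whitened_layer=cw_layer)
    plot_concept_top50(args, test_loader_with_path, model, cw_layer, activation_mode=args.act_mode)
```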
Hi and thanks again for the reply!
Sorry, I missed that part in your paper. The baseline is the model trained with the auxiliary loss, right?
Really appreciate your help,
Kind regards,
Michael
Yes, the baseline is trained with the auxiliary concept loss.
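For context, the sketch below illustrates the general shape of such a baseline objective: a standard classification loss plus a weighted auxiliary concept-classification term. The names concept_head and lambda_concept, the pooling step, and the use of cross-entropy for the concept term are assumptions made for illustration only; they are not taken from this repository.

```python
import torch.nn.functional as F

# Illustrative sketch (assumed), not the repository's implementation:
# a baseline objective that adds an auxiliary concept loss to the main loss.
def baseline_loss(model, concept_head, images, labels, concept_labels, lambda_concept=0.1):
    features, logits = model(images)              # assumed: model returns intermediate features and class logits
    main_loss = F.cross_entropy(logits, labels)   # ordinary classification loss

    # auxiliary term: predict concept labels from pooled intermediate features
    pooled = features.mean(dim=(2, 3))            # global average pooling over spatial dims
    concept_loss = F.cross_entropy(concept_head(pooled), concept_labels)

    return main_loss + lambda_concept * concept_loss
```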
Hi, and thanks for providing all the code for your experiments!
I noticed that the whitening layer number is hardcoded in the plot function:
```python
if args.evaluate == 'plot_top50':
    print("Plot top50 activated images")
    model = load_resnet_model(args, arch='resnet_cw', depth=18, whitened_layer='8')
    plot_concept_top50(args, test_loader_with_path, model, '8', activation_mode=args.act_mode)
```
Is it correct to assume that the whitened_layer arguments passed to load_resnet_model and plot_concept_top50 have to be changed depending on where the whitening layer was placed during training?
Thanks in advance,
Best regards,
Mike