wyharveychen / CloserLookFewShot

Source code for the ICLR'19 paper 'A Closer Look at Few-shot Classification'

params.num_class in Baseline/Baseline++ method #23

Open haohang96 opened 5 years ago

haohang96 commented 5 years ago

In the training stage of the Baseline/Baseline++ methods, it seems that you always set num_class to 200, but every dataset has its own number of base classes (for example, miniImageNet has 64 base classes). Is it set to 200 for convenience (so that nobody needs to modify this parameter when running experiments on both CUB and miniImageNet)? Wouldn't it be better to make num_class an optional command-line argument, or to set it to 64 for miniImageNet and 50 for CUB?
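A minimal sketch of what a dataset-dependent command-line option could look like. The flag names, the defaults table, and the values in it are assumptions taken from this thread, not from the repository's actual `io_utils.py`:

```python
import argparse

# Assumed per-dataset base-class counts (64 for miniImagenet per this thread,
# 100 for CUB per the reply below); not taken from the repo itself.
BASE_CLASSES = {'miniImagenet': 64, 'CUB': 100}

parser = argparse.ArgumentParser()
parser.add_argument('--dataset', default='miniImagenet', choices=BASE_CLASSES)
# num_class defaults to None so we can fill it in from the dataset afterwards.
parser.add_argument('--num_class', type=int, default=None)

args = parser.parse_args(['--dataset', 'CUB'])  # example invocation
if args.num_class is None:
    args.num_class = BASE_CLASSES[args.dataset]
print(args.num_class)
```

With this, passing `--num_class` explicitly still overrides the per-dataset default.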

GeLiu6 commented 5 years ago

I think this default parameter is just for convenience and has no influence on the experimental results; you can set num_class to the number of base classes on the command line. In my reimplementation, setting num_class to 64 works fine on the mini-ImageNet dataset, but a runtime error occurs when setting num_class to 100 on CUB (in CUB, the number of base classes is 100). I found that the base-class labels in CUB are not contiguous after preprocessing by write_CUB_filelist.py, so an error occurs in CrossEntropyLoss (labels should be 0-99 for 100-class classification). If you want to set num_class to 100 on CUB, I suggest relabeling the base classes to 0-99 before pre-training.
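The relabeling suggested above can be sketched as follows. This is a minimal illustration, not the repo's code; the sparse label values in `raw_labels` are made up:

```python
# Remap non-contiguous class ids to dense 0..N-1 indices, so that
# nn.CrossEntropyLoss sees valid targets for an N-way classifier.
raw_labels = [3, 7, 3, 42, 7, 99]  # hypothetical sparse class ids

unique_ids = sorted(set(raw_labels))                      # [3, 7, 42, 99]
id_to_idx = {cls_id: i for i, cls_id in enumerate(unique_ids)}
dense_labels = [id_to_idx[c] for c in raw_labels]          # [0, 1, 0, 2, 1, 3]

num_class = len(unique_ids)  # the value to pass as num_class (here, 4)
```

Applying such a remap once, right after building the filelist, keeps the rest of the pre-training code unchanged.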

haohang96 commented 5 years ago

Yes, I think so. I just meant that the way num_class is set can be a little confusing when someone reads the code for the first time. Thanks for your response!


aiyolo commented 4 years ago

What is the format of the miniImagenet labels? I haven't downloaded the dataset yet.