shuaichaochao / HybridHash

HybridHash: Hybrid Convolutional and Self-Attention Deep Hashing for Image Retrieval (ICMR 2024)

about the configuration for the run #2

Closed: Shayne-Pro closed this issue 3 months ago

Shayne-Pro commented 5 months ago

I ran 300 epochs with the following configuration, and the best MAP (mean average precision) was only 0.348. Could you please share the specific configuration you used for the run?

[configuration screenshot]

shuaichaochao commented 5 months ago

The parameter settings are the same as those in train.py. Have you loaded the pre-trained model weights (jx_nest_base-8bc41011.pth)?
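
For anyone hitting the same issue, below is a minimal sketch of what loading that checkpoint before training can look like. The `HybridHash` import path and the `hash_bit` argument are assumptions for illustration, not necessarily the identifiers used in this repository.

```python
import torch

# Minimal sketch (not the repo's actual code): load the timm NesT-Base
# checkpoint and copy the matching backbone weights into the model before
# training. The import path and constructor arguments are hypothetical.
from HybridHash import HybridHash  # hypothetical import

model = HybridHash(hash_bit=64)

# Some checkpoints wrap the weights in a "state_dict"/"model" key; unwrap if needed.
state_dict = torch.load("jx_nest_base-8bc41011.pth", map_location="cpu")
if isinstance(state_dict, dict) and "state_dict" in state_dict:
    state_dict = state_dict["state_dict"]

# strict=False skips the hashing head, which is trained from scratch.
missing, unexpected = model.load_state_dict(state_dict, strict=False)
print(f"missing keys: {len(missing)}, unexpected keys: {len(unexpected)}")
```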

YinqiChen-DHQ commented 4 months ago

I loaded the pre-trained model weights (jx_nest_base-8bc41011.pth) and also ran 300 epochs (batch size 64), but the best MAP was only 0.323 on ImageNet (64-bit). Could you please share the loss values that achieve better results?
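
As a side note on the metric being compared here: MAP is mean average precision over Hamming-ranked retrieval results (MAP@1000 is common for ImageNet). A minimal NumPy sketch of that computation, independent of this repo's evaluation script and with hypothetical array names, is:

```python
import numpy as np

def mean_average_precision(query_codes, db_codes, query_labels, db_labels, topk=1000):
    """MAP@topk for binary hash codes in {-1, +1}.

    query_codes: (nq, nbits), db_codes: (nd, nbits),
    query_labels / db_labels: one-hot or multi-hot label matrices.
    """
    aps = []
    for q_code, q_label in zip(query_codes, query_labels):
        # Hamming distance via the inner product of +/-1 codes.
        hamming = 0.5 * (db_codes.shape[1] - db_codes @ q_code)
        order = np.argsort(hamming)[:topk]
        relevant = (db_labels[order] @ q_label) > 0
        if relevant.sum() == 0:
            continue
        cum_rel = np.cumsum(relevant)
        precision_at_k = cum_rel / np.arange(1, len(relevant) + 1)
        aps.append((precision_at_k * relevant).sum() / relevant.sum())
    return float(np.mean(aps))
```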

shuaichaochao commented 4 months ago

> I loaded the pre-trained model weights (jx_nest_base-8bc41011.pth) and also ran 300 epochs (batch size 64), but the best MAP was only 0.323 on ImageNet (64-bit). Could you please share the loss values that achieve better results?

I've uploaded the training log for your reference.

shuaichaochao commented 3 months ago

> I loaded the pre-trained model weights (jx_nest_base-8bc41011.pth) and also ran 300 epochs (batch size 64), but the best MAP was only 0.323 on ImageNet (64-bit). Could you please share the loss values that achieve better results?

In the previous version of the model code (HybridHash.py), the interaction module was disabled, which resulted in very low MAP. This has been corrected: the model code now includes the interaction module.
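
The thread does not show the corrected code itself; purely as an illustration of what such an interaction step between a convolutional branch and a self-attention branch can look like (this is not the HybridHash implementation, and every name below is hypothetical):

```python
import torch
import torch.nn as nn

class InteractionBlock(nn.Module):
    """Illustrative only: a conv branch and an attention branch whose outputs
    are fused ("interact") before being mixed back to the input dimension."""

    def __init__(self, dim, num_heads=4):
        super().__init__()  # dim must be divisible by num_heads
        self.local = nn.Conv2d(dim, dim, kernel_size=3, padding=1, groups=dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.fuse = nn.Conv2d(2 * dim, dim, kernel_size=1)

    def forward(self, x):  # x: (B, C, H, W)
        b, c, h, w = x.shape
        local = self.local(x)
        tokens = x.flatten(2).transpose(1, 2)        # (B, H*W, C)
        attn, _ = self.attn(tokens, tokens, tokens)
        attn = attn.transpose(1, 2).reshape(b, c, h, w)
        # Interaction: concatenate both views, project back to C channels, add residual.
        return self.fuse(torch.cat([local, attn], dim=1)) + x
```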

Shayne-Pro commented 3 months ago

Thank you for your response. After re-running the code, I was able to reproduce the reported results.