wgcban / HyperTransformer

[CVPR'22] HyperTransformer: A Textural and Spectral Feature Fusion Transformer for Pansharpening
https://www.wgcban.com/research#h.ar24vwqlm021
MIT License

Questions about the parameters in config_HSIT.json, the number of heads in multi-head attention, and the calculation of metrics #12

Open HaiMaoShiTang opened 1 year ago

HaiMaoShiTang commented 1 year ago

@wgcban Hello, thank you for your outstanding work on HyperTransformer and for providing the code. Before citing your paper, I have a few questions. First, the paper states that the best performance was achieved with 16 heads in the multi-head attention, but the best model you provide uses 8 heads according to config_HSIT.json, and the RGB parameters in the same file contain errors. Could you provide the correct best model and config_HSIT.json file? It is difficult to reproduce your method without them. Second, for the metrics, did you use the results generated by the code, or did you re-calculate them in MATLAB? Your response is very important, and I am grateful for your work.
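For anyone trying to reproduce the paper's setting in the meantime, the sketch below shows one way to patch the shipped config so the head count matches the 16 reported in the paper, and a generic reference implementation of the Spectral Angle Mapper (SAM), a standard pansharpening metric, for cross-checking code-generated numbers against a MATLAB evaluation. The JSON key name `num_heads` is an assumption and may differ from the key the repository actually uses, and the SAM function is not the authors' evaluation code.

```python
import json
from pathlib import Path

import numpy as np

# --- Patch the config so the head count matches the paper (16). ---
# "num_heads" is a hypothetical key name; adjust it to whatever key
# config_HSIT.json actually uses for the multi-head attention heads.
cfg_path = Path("config_HSIT.json")
if cfg_path.exists():
    cfg = json.loads(cfg_path.read_text())
    cfg["num_heads"] = 16
    Path("config_HSIT_16heads.json").write_text(json.dumps(cfg, indent=2))

# --- Spectral Angle Mapper (SAM), a standard pansharpening metric. ---
def sam_degrees(ref: np.ndarray, est: np.ndarray, eps: float = 1e-12) -> float:
    """Mean spectral angle in degrees between two (H, W, C) image cubes."""
    ref = ref.reshape(-1, ref.shape[-1]).astype(np.float64)
    est = est.reshape(-1, est.shape[-1]).astype(np.float64)
    dot = np.sum(ref * est, axis=1)
    denom = np.linalg.norm(ref, axis=1) * np.linalg.norm(est, axis=1) + eps
    angles = np.arccos(np.clip(dot / denom, -1.0, 1.0))
    return float(np.degrees(angles.mean()))

# Toy check: identical cubes give a SAM of ~0 degrees.
cube = np.random.rand(32, 32, 31)
print(sam_degrees(cube, cube))
```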

HaiMaoShiTang commented 1 year ago

@wgcban There are many errors in the code, and more keep turning up the further we dig. Could you please provide the final, corrected code and the best model you trained? We look forward to reproducing your excellent work.

hachreak commented 1 year ago

Hi @wgcban @HaiMaoShiTang, is there any news about working code/config/pretrained weights? :) Thanks a lot