Snowstormfly / Cross-modal-retrieval-MLAGT

Multi-Level and Attention-Guided Tokenization

Request for "Seen" and "Unseen" Data Table #2

Closed · MiguelMC-UNEX closed this issue 3 months ago

MiguelMC-UNEX commented 3 months ago

Dear Bo Yang, Chen Wang, Xiaoshuang Ma, Beiping Song, Zhuang Liu, and Fangde Sun,

I hope this message finds you well. I recently read your paper titled "Zero-shot sketch-based remote sensing image retrieval based on multi-level and attention-guided tokenization" and found it to be incredibly insightful. The analysis and results you presented on "seen" and "unseen" data are particularly intriguing and crucial for my current research.

I am writing to inquire if the table containing the "seen" and "unseen" data, as referenced in the paper, is available. Access to this table would greatly aid in replicating your experiments and further understanding the methodologies you have described. Additionally, I would like to request the code used to generate these tables, as it would be instrumental in my research efforts.

This information is essential for my undergraduate thesis, and having access to both the data table and the code would significantly enhance the quality and depth of my work.

Could you please provide information on where I can find this table and the associated code, or if they will be made available for public access? Any guidance or resources you could share would be greatly appreciated.

Thank you for your time and for your valuable contribution to the field.

Best regards,

Miguel

Snowstormfly commented 3 months ago

Dear Miguel,

In response to your request for the "seen" and "unseen" class data: we have uploaded the RSketch_Ext dataset used in this paper, an extension of the RSketch dataset, to Baidu Netdisk. You can view and download the data by clicking the "Baidu web disk" link on the GitHub page. The code used in this paper has been open-sourced on GitHub. For the code of the other baseline methods used for comparison, I suggest obtaining it from the references listed in the paper.

Best regards,

Bo Yang

MiguelMC-UNEX commented 3 months ago

Dear Bo Yang,

Thank you for the information. However, what I actually need is the code used to generate the table of "seen" and "unseen" classes. I apologize for any confusion.

Best regards,

Miguel

Snowstormfly commented 3 months ago

Dear Miguel,

In our experiments, we did not use any code to automatically generate the "seen" and "unseen" classes. The division was done manually, following the classification in Table 3 of the paper: we split the 20 categories of the RSketch_Ext dataset into four folds for training and testing to evaluate the performance of our model. Therefore, if you want to test the model's effectiveness, you should likewise manually adjust the categories of the training and testing sets before running the evaluation.
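
For concreteness, here is a minimal Python sketch of such a manual 4-fold split. The category names are placeholders rather than the actual RSketch_Ext classes from Table 3, and the 15-seen/5-unseen ratio per fold is an assumption based on the description above, not a detail confirmed in this thread.

```python
# Hypothetical sketch of a manual 4-fold seen/unseen class split.
# Placeholder names -- substitute the actual 20 RSketch_Ext categories
# from Table 3 of the paper.
CATEGORIES = [f"class_{i:02d}" for i in range(20)]

# Partition the 20 categories into 4 disjoint folds of 5 classes each
# (assumed ratio; adjust if the paper's folds differ).
FOLDS = [CATEGORIES[i * 5:(i + 1) * 5] for i in range(4)]

def split_seen_unseen(fold_idx: int):
    """Hold out one fold as 'unseen' (test); the rest are 'seen' (train)."""
    unseen = FOLDS[fold_idx]
    seen = [c for i, fold in enumerate(FOLDS) if i != fold_idx for c in fold]
    return seen, unseen

for k in range(4):
    seen, unseen = split_seen_unseen(k)
    print(f"Fold {k}: {len(seen)} seen / {len(unseen)} unseen classes")
```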

Best regards,

Bo Yang