Berkeley-Data / hpt

MIT License

convert moco pretraining to sen12ms evaluation pretraining model #48

Open taeil opened 3 years ago

taeil commented 3 years ago

[image: model structure comparison — left: resnet50 from PyTorch; right: resnet50 from OpenSelfSup]
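A quick way to compare the two structures is to diff the parameter names. The sketch below assumes the usual naming difference: torchvision uses keys like `conv1.weight`, while the OpenSelfSup MoCo checkpoint stores its query-encoder backbone under a `backbone.` prefix. Plain dicts stand in for the real checkpoints here.

```python
def compare_keys(torchvision_keys, moco_keys, prefix="backbone."):
    """Compare parameter names after stripping the MoCo backbone prefix.

    Returns (matched, missing_in_moco, extra_in_moco), each sorted.
    """
    stripped = {k[len(prefix):] for k in moco_keys if k.startswith(prefix)}
    tv = set(torchvision_keys)
    return sorted(tv & stripped), sorted(tv - stripped), sorted(stripped - tv)

# Toy example; in real use these would be model.state_dict().keys()
# from torchvision's resnet50 and the loaded MoCo checkpoint.
tv_keys = ["conv1.weight", "bn1.weight", "fc.weight"]
moco_keys = ["backbone.conv1.weight", "backbone.bn1.weight", "queue"]

matched, missing, extra = compare_keys(tv_keys, moco_keys)
print(matched)  # params present in both
print(missing)  # torchvision params absent from the MoCo backbone (e.g. the fc head)
```

The MoCo model drops the classification head and adds contrastive-learning buffers (e.g. the queue), so some mismatch in both directions is expected.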

suryagutta commented 3 years ago

Attached the model structure comparison between the PyTorch model and the MoCo model from OpenSelfSup. [image: model structure comparison]

suryagutta commented 3 years ago

Updated the code and checked it in. The converted/extracted backbone is uploaded to /home/ubuntu/SEN12MS/pretrained/moco (Taeil's system) and to /home/taeil/SEN12MS/pretrained/moco (Colorado's new system):

- silvery-oath7-2rr3864e_backbone.pth (extracted backbone weights)
- silvery-oath7-2rr3864e_queryencoder.pth (also created, in case we want to use weights from the query encoder)
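A minimal sketch of the extraction step, assuming the MoCo checkpoint keeps its weights under a `state_dict` entry with a `backbone.` prefix on the query-encoder parameters (the usual OpenSelfSup layout); in real use the dict would come from `torch.load(...)` and the result would go to `torch.save(...)`.

```python
def extract_backbone(checkpoint, prefix="backbone."):
    """Strip `prefix` from matching keys so the weights load into a plain ResNet50."""
    state = checkpoint.get("state_dict", checkpoint)
    return {k[len(prefix):]: v for k, v in state.items() if k.startswith(prefix)}

# Toy checkpoint: one backbone tensor plus a MoCo-only buffer, which is dropped.
ckpt = {"state_dict": {"backbone.conv1.weight": 1, "queue_ptr": 2}}
print(extract_backbone(ckpt))  # {'conv1.weight': 1}
```

Saving the stripped dict as a standalone .pth file is what lets the SEN12MS evaluation code load it like an ordinary torchvision ResNet50 state dict.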

taeil commented 3 years ago

Updated main_train to take parameters for pt_type (bb = backbone, qe = query encoder) and pt_name (root model name), and updated the documentation.
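The new options might be wired up roughly as below; this is a hypothetical argparse sketch, and the exact flag names and defaults in main_train may differ.

```python
import argparse

def build_parser():
    p = argparse.ArgumentParser(description="SEN12MS evaluation training")
    p.add_argument("--pt_type", choices=["bb", "qe"], default="bb",
                   help="which pretrained weights to load: bb=backbone, qe=query encoder")
    p.add_argument("--pt_name", type=str, required=True,
                   help="root model name, e.g. silvery-oath7-2rr3864e")
    return p

# Example invocation: load query-encoder weights for the named model.
args = build_parser().parse_args(["--pt_type", "qe", "--pt_name", "silvery-oath7-2rr3864e"])
print(args.pt_type, args.pt_name)
```

The pt_name root would then be expanded to the corresponding file, e.g. `<pt_name>_backbone.pth` or `<pt_name>_queryencoder.pth`, depending on pt_type.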

taeil commented 3 years ago

Opened a PR.

suryagutta commented 3 years ago

Merged the pull request after the review.

suryagutta commented 3 years ago

Added the input-module data under a separate key, 'input_module', and uploaded the converted models to /home/taeil/data/moco_to_resnet50_models (Colorado system). The input_module data is extracted from the query encoder of the MoCo model. [image]
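The split described above can be sketched as follows; the `input_module.` prefix used here is an assumption for illustration, as is the helper name, with plain dicts standing in for the loaded checkpoint.

```python
def split_query_encoder(state, input_prefix="input_module.", backbone_prefix="backbone."):
    """Split query-encoder weights into separate 'input_module' and 'backbone' groups,
    stripping each group's prefix so the weights load into the standalone modules."""
    out = {"input_module": {}, "backbone": {}}
    for k, v in state.items():
        if k.startswith(input_prefix):
            out["input_module"][k[len(input_prefix):]] = v
        elif k.startswith(backbone_prefix):
            out["backbone"][k[len(backbone_prefix):]] = v
    return out

# Toy query-encoder state: one input-module tensor and one backbone tensor.
state = {"input_module.proj.weight": 1, "backbone.conv1.weight": 2}
converted = split_query_encoder(state)
print(sorted(converted))  # the two top-level keys of the converted checkpoint
```

Saving `converted` as a single checkpoint keeps the input-module weights addressable under their own key while the backbone remains loadable on its own.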