neptune-ai / open-solution-mapping-challenge

Open solution to the Mapping Challenge :earth_americas:
https://www.crowdai.org/challenges/mapping-challenge

Transfer learning using the available weights #233


Ahsanr312 commented 3 years ago

Hi! I have been following the repository and used the available weights for prediction on my own dataset, which resulted in poor segmentation. While thinking about how to achieve better results on my own dataset, I came up with a few ideas:

  1. Train the architecture only on my custom dataset
  2. Train the architecture on MS COCO + my custom dataset
  3. Train the architecture using transfer learning on my custom dataset

I believe I won't get good results with the first approach, whereas the second approach could give me decent results; however, what I am really looking for is a method for the third approach, fine-tuning from the available weights, along the lines of the sketch below.
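To make the third option concrete, this is roughly the setup I have in mind: a minimal PyTorch sketch that loads the published weights, freezes the encoder, and trains only the decoder on my data. The `UNet` class, `custom_loader`, the `encoder` attribute, and the checkpoint filename are all placeholders, not this repo's actual code.

```python
# Minimal transfer-learning sketch (names are placeholders, not this repo's
# exact API): load the published weights, freeze the encoder, and fine-tune
# only the decoder on the custom dataset.
import torch

from my_project.model import UNet          # hypothetical model class
from my_project.data import custom_loader  # hypothetical DataLoader factory

model = UNet(num_classes=2)
state = torch.load("available_weights.pth", map_location="cpu")
model.load_state_dict(state)

# Freeze encoder parameters so only the decoder is updated.
for param in model.encoder.parameters():
    param.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4
)
criterion = torch.nn.CrossEntropyLoss()

model.train()
for images, masks in custom_loader():
    optimizer.zero_grad()
    loss = criterion(model(images), masks)
    loss.backward()
    optimizer.step()
```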

Looking forward to hearing from you all.

jakubczakon commented 3 years ago

Hi @Ahsanr312 thank you for the suggestion.

Actually, you can do any of those options right now. This repository is more about how we trained our solution (how to train it) than about "generalizable" results; we mainly wanted to share our work on this competition with the world.

One thing to keep in mind is that this was built for buildings, so you should perhaps use a different dataset than MS COCO (Cityscapes is a good one, I believe).

Good luck with fine-tuning!
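For fine-tuning, the rough recipe is just: build the same architecture, initialize it from the trained checkpoint instead of random weights, adapt the output head if your class count differs, and continue training at a reduced learning rate. A rough PyTorch sketch, where the `UNet` class, the `final` head name, and the paths are placeholders rather than this repo's exact pipeline:

```python
import torch

from my_project.model import UNet  # hypothetical model class

# Build the architecture that matches the published checkpoint.
model = UNet(num_classes=2)
state = torch.load("available_weights.pth", map_location="cpu")

# If the custom dataset has a different number of classes, drop the output
# head from the checkpoint, load the rest non-strictly, and attach a fresh
# head ("final" and its channel count are placeholder names here).
state = {k: v for k, v in state.items() if not k.startswith("final")}
model.load_state_dict(state, strict=False)
model.final = torch.nn.Conv2d(64, 5, kernel_size=1)  # e.g. 5 custom classes

# Fine-tune the whole network with a reduced learning rate.
optimizer = torch.optim.Adam(model.parameters(), lr=1e-5)
```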

Ahsanr312 commented 3 years ago

I have looked for steps to fine-tune the model but couldn't find anything. Can you help me with it? @jakubczakon

data-overload commented 3 years ago

@Ahsanr312 Were you able to figure out how to tune the model? I prepared the training directory as per the README with my own images and annotations, but I get a `DecodeError` when running the train command. Edit: that error is fixed by upgrading the Neptune client (see the command below); however, I still struggle to produce newly tuned weights (no errors, but I don't see any output from training).
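The upgrade that cleared the `DecodeError` for me:

```bash
pip install --upgrade neptune-client
```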