SamsungLabs / fbrs_interactive_segmentation

[CVPR2020] f-BRS: Rethinking Backpropagating Refinement for Interactive Segmentation https://arxiv.org/abs/2001.10331
Mozilla Public License 2.0

How to generate sbd_samples_weights.pkl #31

Closed gongl-cn closed 3 years ago

gongl-cn commented 3 years ago

I want to train on another dataset, so I need to generate the .pkl file myself. What does the third parameter in each entry represent?

happyCodingSusan commented 3 years ago

I want to ask the same question.

ksofiyuk commented 3 years ago

This pickle file contains the average train loss for each sample in SBD. To obtain it, we ran a trained model with frozen weights for 10 epochs with all augmentations to collect loss statistics. It is a sort of hard-negative mining over the whole dataset: if you use that file, "hard" samples with higher average losses are drawn with increased probability rather than from a uniform distribution. Our later internal experiments showed that it didn't provide significant improvements on other datasets, and we trained models on LVIS+COCO without that trick. Unfortunately, we don't have a separate script for that procedure, as it was written in a "dirty" mode by modifying the existing train code in a temporary branch.
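To make the "increased probability" part concrete, here is a minimal sketch of loss-weighted sampling. The pickle layout (loss as the last tuple element) and the `gamma` exponent are assumptions for illustration, not the exact code from this repo:

```python
import pickle
import numpy as np

# Assumed layout: a list of tuples per sample, where the last element
# is that sample's average training loss.
with open('sbd_samples_weights.pkl', 'rb') as f:
    samples_stats = pickle.load(f)

avg_losses = np.array([s[-1] for s in samples_stats], dtype=np.float64)

# Turn losses into sampling probabilities: harder samples (higher loss)
# are drawn more often. gamma > 1 sharpens the distribution (hypothetical value).
gamma = 1.25
probs = avg_losses ** gamma
probs /= probs.sum()

# Draw a batch of training indices according to these probabilities
batch_indices = np.random.choice(len(samples_stats), size=16, p=probs)
```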

In general, there is no need to generate a pkl file for a new dataset; just set samples_scores_path=None, and it will hardly affect the performance on your dataset.
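For example (the module path and the remaining `SBDDataset` constructor arguments here are assumptions; check the dataset classes under `isegm/data` for the exact signatures):

```python
from isegm.data.sbd import SBDDataset  # import path is an assumption

trainset = SBDDataset(
    '/path/to/your/dataset',
    split='train',
    samples_scores_path=None,  # uniform sampling; no .pkl required
)
```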

gongl-cn commented 3 years ago

If I wanted to generate this .pkl file, would I just have to freeze some of the parameters?

ksofiyuk commented 3 years ago

Simply put, yes: you need to freeze all model parameters and collect loss statistics for several epochs.
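We no longer have the original script, but a minimal sketch could look like the following. The `model`, `train_loader`, and `criterion` interfaces are placeholders for whatever your training code uses; the loader is assumed to yield (sample_ids, images, targets) batches and the criterion to return per-pixel losses (reduction='none'):

```python
import pickle
import torch

def collect_sample_weights(model, train_loader, criterion, num_epochs=10,
                           out_path='samples_weights.pkl'):
    """Average per-sample training loss over several augmented epochs."""
    # Freeze the trained model so this pass never updates it
    model.eval()
    for p in model.parameters():
        p.requires_grad_(False)

    loss_sums, counts = {}, {}
    with torch.no_grad():
        for _ in range(num_epochs):
            for sample_ids, images, targets in train_loader:
                outputs = model(images)
                # Per-sample loss: average the per-pixel loss over C, H, W
                losses = criterion(outputs, targets).mean(dim=(1, 2, 3))
                for sid, loss in zip(sample_ids, losses):
                    sid = int(sid)
                    loss_sums[sid] = loss_sums.get(sid, 0.0) + float(loss)
                    counts[sid] = counts.get(sid, 0) + 1

    # One (sample_id, average_loss) entry per sample; the exact tuple
    # layout of the original sbd_samples_weights.pkl may differ.
    weights = [(sid, loss_sums[sid] / counts[sid]) for sid in sorted(loss_sums)]
    with open(out_path, 'wb') as f:
        pickle.dump(weights, f)
    return weights
```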

gongl-cn commented 3 years ago

OK, thank you very much!