KaiyangZhou / deep-person-reid

Torchreid: Deep learning person re-identification in PyTorch.
https://kaiyangzhou.github.io/deep-person-reid/
MIT License

How to train a hacnn with triplet loss? #377

Open Messagel opened 3 years ago

Messagel commented 3 years ago

When I set up a YAML config as follows:

```yaml
model:
  name: 'hacnn'
  pretrained: True

data:
  type: 'image'
  sources: ['market1501']
  targets: ['market1501']
  height: 160
  width: 64
  combineall: False
  transforms: ['random_flip']
  save_dir: 'log/hacnn_market1501_triplet'

loss:
  name: 'triplet'
  softmax:
    label_smooth: True

sampler:
  train_sampler: 'RandomIdentitySampler'

train:
  optim: 'amsgrad'
  lr: 0.0003
  max_epoch: 60
  batch_size: 32
  fixbase_epoch: 5
  open_layers: ['fc_global', 'fc_local', 'classifier_global', 'classifier_local']
  lr_scheduler: 'single_step'
  stepsize: [20]

test:
  batch_size: 100
  dist_metric: 'euclidean'
  normalize_feature: False
  evaluate: False
  eval_freq: -1
  rerank: False
```

and ran it with:

```bash
python scripts/main.py --config-file configs/im_hacnn_triplet_160x64_amsgrad.yaml --root data -s market1501 -t market1501 data.save_dir log/hacnn_market1501_triplet
```

I got this error:

```
Traceback (most recent call last):
  File "scripts/main.py", line 193, in <module>
    main()
  File "scripts/main.py", line 139, in main
    check_cfg(cfg)
  File "scripts/main.py", line 94, in check_cfg
    'The output of classifier is not included in the computational graph'
AssertionError: The output of classifier is not included in the computational graph
```

So how do I train HACNN with triplet loss?

KaiyangZhou commented 3 years ago

triplet loss doesn't need the classifier layer because the loss is computed only on the output features

but your cfg sets fixbase_epoch=5, during which only the classifier layers are updated, and those are actually unused by the triplet loss

so in order to use triplet loss only (unless you want to combine triplet loss with softmax loss), you should set fixbase_epoch=0
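For instance, the train section of your config above would become something like this (a sketch of the minimal change; open_layers can be dropped because nothing is frozen when fixbase_epoch is 0):

```yaml
train:
  optim: 'amsgrad'
  lr: 0.0003
  max_epoch: 60
  batch_size: 32
  fixbase_epoch: 0   # no frozen warm-up stage; the whole network trains from epoch 0
  lr_scheduler: 'single_step'
  stepsize: [20]
```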

I've added a condition in main.py to check this setting, see https://github.com/KaiyangZhou/deep-person-reid/blob/master/scripts/main.py#L91

please use the latest commit
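For reference, the added check is roughly the following (paraphrased, not verbatim; follow the link above for the exact code):

```python
# sketch of the condition added to check_cfg in scripts/main.py (paraphrased)
def check_cfg(cfg):
    if cfg.loss.name == 'triplet' and cfg.loss.triplet.weight_x == 0:
        # pure triplet loss never backpropagates through the classifier, so a
        # warm-up stage that trains only the classifier heads would update
        # layers that contribute nothing to the loss
        assert cfg.train.fixbase_epoch == 0, \
            'The output of classifier is not included in the computational graph'
```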

Messagel commented 3 years ago

> triplet loss doesn't need the classifier layer because the loss is computed only on the output features
>
> but your cfg sets fixbase_epoch=5, during which only the classifier layers are updated, and those are actually unused by the triplet loss
>
> so in order to use triplet loss only (unless you want to combine triplet loss with softmax loss), you should set fixbase_epoch=0
>
> I've added a condition in main.py to check this setting, see https://github.com/KaiyangZhou/deep-person-reid/blob/master/scripts/main.py#L91
>
> please use the latest commit

```yaml
loss:
  name: 'triplet'
  softmax:
    label_smooth: True
  triplet:
    weight_t: 1.0
    weight_x: 1.0

train:
  optim: 'amsgrad'
  lr: 0.01
  max_epoch: 60
  batch_size: 32
  fixbase_epoch: 5
  open_layers: ['fc', 'classifier']
  lr_scheduler: 'multi_step'
  stepsize: [40, 50]
```

Is this the right way to train with softmax and triplet loss together? Do the two losses drive training jointly as one objective, i.e. criterion = loss_x * weight_x + loss_t * weight_t? (Or do I need to write a custom loss that adds triplet and softmax myself?)

KaiyangZhou commented 3 years ago

yes, if you set both weights to one, or any values bigger than zero, then the two losses are used jointly

I highly encourage you to check the code at engine/image/triplet.py
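For example, the joint objective in that engine boils down to a weighted sum of the two criteria. A minimal sketch, assuming the torchreid.losses API (the function and variable names here are illustrative, not the engine's exact internals):

```python
from torchreid.losses import CrossEntropyLoss, TripletLoss

# one criterion per term: triplet on embedding features, softmax on classifier logits
criterion_t = TripletLoss(margin=0.3)
criterion_x = CrossEntropyLoss(num_classes=751, label_smooth=True)  # 751 = Market-1501 train IDs

def joint_loss(logits, features, pids, weight_t=1.0, weight_x=1.0):
    loss_t = criterion_t(features, pids)  # metric-learning term on the features
    loss_x = criterion_x(logits, pids)    # classification term on the logits
    # the weighted terms are simply added; no custom combined loss class is needed
    return weight_t * loss_t + weight_x * loss_x
```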