Closed — kakusikun closed this issue 4 years ago
We set the parameters here: https://github.com/ifzhang/FairMOT/blob/12f50ca8821d7d3ecbebea8afdf673e421ebba32/src/lib/trains/mot.py#L35
Thanks for your quick reply. I know the uncertainty is defined as a parameter, but it also needs to be passed to the optimizer to be updated, not just declared as a parameter. So the question remains: where is it updated?
I use the parameters in the loss below: https://github.com/ifzhang/FairMOT/blob/4cdb3499a80d5de3bda79d6a4e43db5878570bbf/src/lib/trains/mot.py#L76
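For reference, the loss at that link is a two-task uncertainty weighting in the style of Kendall et al. A minimal, self-contained sketch of that pattern (the class name and the initial values of `s_det` / `s_id` here are illustrative, not copied from the repo):

```python
import torch
import torch.nn as nn

class UncertaintyLoss(nn.Module):
    """Sketch of two-task uncertainty weighting (Kendall et al. style)."""
    def __init__(self):
        super().__init__()
        # learnable log-variance weights, registered as parameters
        self.s_det = nn.Parameter(-1.85 * torch.ones(1))
        self.s_id = nn.Parameter(-1.05 * torch.ones(1))

    def forward(self, det_loss, id_loss):
        # each task loss is scaled by exp(-s); the "+ s" terms act as a regularizer
        return 0.5 * (torch.exp(-self.s_det) * det_loss
                      + torch.exp(-self.s_id) * id_loss
                      + self.s_det + self.s_id)

loss_fn = UncertaintyLoss()
total = loss_fn(torch.tensor(1.0), torch.tensor(2.0))
total.backward()  # s_det and s_id receive gradients
```

The gradients exist because `s_det` and `s_id` are `nn.Parameter`s, but an optimizer will only apply those gradients if the parameters were handed to it, which is the point of the question.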
I mean something like https://github.com/ifzhang/FairMOT/blob/4cdb3499a80d5de3bda79d6a4e43db5878570bbf/src/train.py#L44. Could you point out where the same is done for the uncertainty parameters?
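To make the request concrete, here is one common way to include a loss module's parameters in the optimizer, a sketch only (the `Loss` class, model, and learning rate are placeholders, not FairMOT's actual code):

```python
import itertools
import torch
import torch.nn as nn

class Loss(nn.Module):
    def __init__(self):
        super().__init__()
        # learnable uncertainty weights
        self.s_det = nn.Parameter(torch.zeros(1))
        self.s_id = nn.Parameter(torch.zeros(1))

    def forward(self, det_loss, id_loss):
        return (torch.exp(-self.s_det) * det_loss
                + torch.exp(-self.s_id) * id_loss
                + self.s_det + self.s_id)

model = nn.Linear(4, 2)   # stand-in for the detection model
loss_fn = Loss()

# pass BOTH parameter sets to the optimizer so the
# uncertainty weights are actually updated during training
optimizer = torch.optim.Adam(
    itertools.chain(model.parameters(), loss_fn.parameters()), lr=1e-4)

out = model(torch.randn(8, 4))
loss = loss_fn(out.pow(2).mean(), out.abs().mean())
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Without the `itertools.chain` (or an equivalent `add_param_group` call), `s_det` and `s_id` would accumulate gradients but never move.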
Thank you very much for your question! I have added the parameters to the optimizer and fixed the bug.
I found that you add a new param_group to the optimizer, so loading the optimizer state will fail when training is resumed from a checkpoint. Maybe you should change the position of the resume code.
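The failure mode being described: `Optimizer.load_state_dict` requires the optimizer to have the same param-group layout as when the state was saved, so the extra group must be added before the checkpoint is loaded. A minimal sketch of the safe ordering (the model, parameter, and checkpoint here are placeholders):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
s_uncertainty = nn.Parameter(torch.zeros(1))  # hypothetical uncertainty weight

# build the optimizer with the SAME groups it had when the checkpoint was saved
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
optimizer.add_param_group({'params': [s_uncertainty]})
checkpoint = {'optimizer': optimizer.state_dict()}  # stands in for torch.load(...)

# resuming: add the extra param_group BEFORE load_state_dict, or it raises
resumed = torch.optim.Adam(model.parameters(), lr=1e-4)
resumed.add_param_group({'params': [s_uncertainty]})
resumed.load_state_dict(checkpoint['optimizer'])
```

Calling `resumed.load_state_dict` without the `add_param_group` line would raise a `ValueError` about mismatched parameter groups.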
Thanks, I will fix the bug.
Here, it looks like the uncertainty is used for multi-task learning.
BUT, I can NOT find where these parameters are updated by any optimizer ... I only found that the loss instance is created here.
Could you point out where the uncertainty is learned during training?