Hi last-one,
I have tried to run your code, and I found that after removing the multi-lr setting, the loss converges; with the multi-lr setting, it does not converge.
I also set the weight decay of the biases to 0 in the multi-lr part, the same as OpenPose, but I don't think that is what causes the bug.
Have you faced this problem?
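For reference, this is roughly the kind of multi-lr setup I mean, sketched in PyTorch (the model, `base_lr`, and the specific multipliers here are illustrative placeholders, not your actual code): biases get a separate parameter group with a higher learning rate and zero weight decay, OpenPose-style.

```python
import torch
import torch.nn as nn

# Stand-in for the real network; any nn.Module with weights and biases works.
model = nn.Conv2d(3, 16, kernel_size=3)
base_lr = 1e-4  # placeholder base learning rate

# Split parameters into weights and biases by name.
weights = [p for n, p in model.named_parameters() if not n.endswith("bias")]
biases = [p for n, p in model.named_parameters() if n.endswith("bias")]

optimizer = torch.optim.SGD(
    [
        # Weights: base lr with L2 weight decay.
        {"params": weights, "lr": base_lr, "weight_decay": 5e-4},
        # Biases: higher lr (2x is a common OpenPose-style choice),
        # and weight decay set to 0 as described above.
        {"params": biases, "lr": base_lr * 2, "weight_decay": 0.0},
    ],
    momentum=0.9,
)
```

With this grouping, the optimizer keeps two `param_groups`, so you can print `optimizer.param_groups[i]["lr"]` during training to verify the per-group rates are what you expect.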