Open Dan1900 opened 7 years ago
@Dan1900 Have you solved this? Was it an error with your build, or something to be set in a prototxt file?
Did you set a flag in your prototxt that forces layers not to run backward?
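For context on what such a flag might look like (this is an assumption about what the commenter means, not something confirmed in the thread): Caffe's prototxt supports a per-layer `propagate_down` field that blocks gradient flow through a bottom blob. If gradients are blocked everywhere, the solver can conclude that no layer needs backward computation. A minimal sketch:

```protobuf
# Hypothetical layer illustrating propagate_down.
# With this set to false, no gradient flows into "data",
# so layers below it may be marked as not needing backward.
layer {
  name: "fc1"
  type: "InnerProduct"
  bottom: "data"
  top: "fc1"
  propagate_down: false   # stops backprop through this bottom
  inner_product_param {
    num_output: 128
  }
}
```

If every path from the loss back to the learnable layers is cut this way, the "does not need backward computation" message is expected rather than an error.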
I've since solved this. In my prototxt I had `loss_weight: 1` in a custom softmax loss layer; removing that seems to have fixed it. It may have been something else, but the broader point is that it works now :)
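For anyone hitting the same message: Caffe decides whether a layer needs backward computation by checking whether any top blob contributes to a loss with a nonzero `loss_weight`. Built-in loss layers (e.g. `SoftmaxWithLoss`) get a default `loss_weight` of 1 automatically, so it normally does not need to be set by hand. A minimal sketch of a standard loss layer (layer and blob names here are placeholders, not from the original poster's network):

```protobuf
# Typical loss layer; Caffe assigns loss_weight: 1 by default,
# which is what makes the backward pass necessary at all.
layer {
  name: "loss"
  type: "SoftmaxWithLoss"
  bottom: "fc_out"   # network predictions
  bottom: "label"    # ground-truth labels
  top: "loss"
}
```

If the net has no layer contributing to a loss (or the loss layer is misconfigured, as with the custom layer above), the log will report that all layers "do not need backward computation" and no learning happens, even though no error is raised.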
@Dan1900 Hello, have you fixed it? I ran into the same problem.
I also ran into the same problem. How should I deal with this issue?
Has anyone solved this problem? I also encountered it and could not find any obvious errors.
When I train my own model, the log reports that all layers do not need backward computation, but there is no error message. Why?