nic25 closed this issue 6 years ago
It might be useful, but in order to reduce clutter in the graph, I would recommend the lr_mult
not be displayed by default - maybe it should be controlled via an optional argument?
As for the presentation, I don't think scaling the saturation is a good idea from an informational point of view. Text will work much better; hiding the label for layers with a zero learning rate could also be considered.
I'd be glad to review a PR with this feature, if you wanted to implement it.
Sure!
Excellent! I will close this issue for now - we shall have further conversation under the PR when you submit it.
Displaying LRM and removing labels when LRM is 0 both work fine. I'll submit a PR when either can be optionally activated.
@nic25 Could we retain the kernel_size/stride/pad/etc. info even when lr_mult is zero? Currently you're removing all information, while I think having the basic layer info regardless of the lr_mult would be useful. Other than that it looks good.
https://github.com/BVLC/caffe/blob/daf013931b31ed9c95250a89d09b7220badbcefe/python/caffe/draw.py#L89
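To illustrate the suggestion above, here is a minimal sketch of how the label-building logic could keep the basic layer info (kernel_size/stride/pad) unconditionally and append lr_mult only behind an optional flag. The function and parameter names here are illustrative, not the actual caffe `draw.py` API:

```python
# Hypothetical sketch: build a node label for a layer, always keeping basic
# layer info and appending lr_mult only when explicitly requested.
# Names (format_layer_label, display_lrm) are illustrative, not caffe's API.

def format_layer_label(name, layer_type, params, lr_mults=None, display_lrm=False):
    """Return a multi-line node label.

    params    : dict of basic layer attributes (kernel_size, stride, pad, ...)
    lr_mults  : list of learning-rate multipliers, one per parameter blob
    display_lrm : if True, append the lr_mult values to the label
    """
    parts = ['%s (%s)' % (name, layer_type)]
    # Basic layer info is always shown, regardless of lr_mult.
    for key in ('kernel_size', 'stride', 'pad'):
        if key in params:
            parts.append('%s: %s' % (key, params[key]))
    if display_lrm and lr_mults:
        # A zero multiplier means the corresponding blob is frozen.
        parts.append('lr_mult: ' + '/'.join(str(m) for m in lr_mults))
    return '\n'.join(parts)
```

With `display_lrm=False` (the default, as recommended above) the graph stays uncluttered, while fine-tuned networks can opt in to see which layers are frozen.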
IMHO including information about learning rate multipliers could be helpful when finetuning a network. We could: