Open aRookieMan opened 5 years ago
If I don't fix (freeze) the pretrained VGG16's parameters, the model's memory usage keeps increasing during training until it runs out of memory. But if I fix these parameters, training works. Why?
code in train.py:

for param in net.rpn.features.parameters():
    param.requires_grad = False
Thanks!
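For context, one common cause of ever-growing memory in PyTorch training loops is keeping a live reference to the loss tensor (and therefore the whole autograd graph), e.g. accumulating `total_loss += loss` instead of `total_loss += loss.item()`; freezing the backbone shrinks the per-step graph and can mask such a leak rather than cure it. Below is a minimal runnable sketch of the freezing pattern from train.py. The small conv backbone and linear head are stand-ins for `net.rpn.features` and the detector head, not the repo's actual model:

```python
import torch
import torch.nn as nn

# Stand-in backbone playing the role of net.rpn.features
# (the pretrained VGG16 convolutional layers) from train.py.
backbone = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(),
    nn.Conv2d(8, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
)
head = nn.Linear(8, 2)  # hypothetical trainable head

# The pattern from the issue: freeze the backbone so no gradients
# (and no gradient buffers) are created for it.
for param in backbone.parameters():
    param.requires_grad = False

# Give the optimizer only the trainable parameters; frozen parameters
# do not need to appear in the parameter groups at all.
optimizer = torch.optim.SGD(
    (p for p in head.parameters() if p.requires_grad), lr=1e-3
)

x = torch.randn(4, 3, 32, 32)
with torch.no_grad():            # extra safety: no graph built for the backbone
    feats = backbone(x).flatten(1)
loss = head(feats).sum()
loss.backward()
optimizer.step()
optimizer.zero_grad()

# Frozen parameters never receive a .grad tensor.
assert all(p.grad is None for p in backbone.parameters())
```

In a real loop, logging `loss.item()` (a Python float) instead of the tensor itself releases each iteration's graph and usually stops the unbounded memory growth even with the backbone unfrozen.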