amdegroot / ssd.pytorch

A PyTorch Implementation of Single Shot MultiBox Detector
MIT License
5.15k stars 1.75k forks

ValueError: optimizing a parameter that doesn't require gradients #109

Open santhoshdc1590 opened 6 years ago

santhoshdc1590 commented 6 years ago

I wanted to freeze the first two layers of the network. Based on this, I wrote code to freeze the first two layers, placed just before the optimizer line (line 105 in train.py).

Here's the code

Freeze weights:

```python
for layer, param in enumerate(net.parameters()):
    if layer == 1 or layer == 2:
        param.requires_grad = False
    else:
        param.requires_grad = True
```
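(Editor's note: as the follow-up below works out, `enumerate(net.parameters())` iterates over individual parameter *tensors*, not layers, so indices 1 and 2 need not correspond to the first two layers. A minimal demo with a toy stand-in model, not the SSD net:)

```python
import torch.nn as nn

# Toy stand-in model: two conv layers, each contributing a weight
# tensor and a bias tensor to net.parameters().
net = nn.Sequential(nn.Conv2d(3, 8, 3), nn.Conv2d(8, 16, 3))

# enumerate() counts parameter tensors, not layers: indices 0/1 are the
# first conv's weight and bias, 2/3 the second conv's.
for i, param in enumerate(net.parameters()):
    print(i, tuple(param.shape))
```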

I'm getting this error on this line:

```python
optimizer = optim.SGD(net.parameters(), lr=args.lr, momentum=args.momentum,
                      weight_decay=args.weight_decay)
```

```
  File "train.py", line 155, in <module>
    optimizer = optim.SGD(net.parameters(), lr=args.lr, momentum=args.momentum, weight_decay=args.weight_decay)
  File "/Users/name/.virtualenvs/test/lib/python3.6/site-packages/torch/optim/sgd.py", line 57, in __init__
    super(SGD, self).__init__(params, defaults)
  File "/Users/name/.virtualenvs/test/lib/python3.6/site-packages/torch/optim/optimizer.py", line 39, in __init__
    self.add_param_group(param_group)
  File "/Users/name/.virtualenvs/test/lib/python3.6/site-packages/torch/optim/optimizer.py", line 153, in add_param_group
    raise ValueError("optimizing a parameter that doesn't require gradients")
ValueError: optimizing a parameter that doesn't require gradients
```

What's wrong? Any help would be appreciated; I'm stuck.

santhoshdc1590 commented 6 years ago

Thanks to this article Some important Pytorch tasks - A concise summary from a vision researcher

My code for freezing the layers was not quite correct.

I was able to get the layers using this: [screenshot, 2018-02-22 2:35 pm]

Output: [screenshot, 2018-02-22 2:36 pm]
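(Editor's note: the screenshots are lost. One way to list the top-level blocks, which is presumably what they showed, is `named_children()`. A minimal sketch with a stand-in module; the real SSD net in this repo has children such as `vgg`, `L2Norm`, `extras`, `loc`, and `conf`:)

```python
import torch.nn as nn

# Stand-in for the SSD model; the attribute names mirror two of the
# ones used in ssd.pytorch (vgg, extras), but the layers are simplified.
net = nn.Module()
net.vgg = nn.Sequential(nn.Conv2d(3, 64, 3), nn.ReLU())
net.extras = nn.Sequential(nn.Conv2d(64, 128, 1))

# named_children() yields a (name, module) pair per top-level block.
for name, child in net.named_children():
    print(name, type(child).__name__)
```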

Now, to freeze just the vgg layers: [screenshot, 2018-02-22 2:28 pm]

Output: [screenshot, 2018-02-22 2:30 pm]
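(Editor's note: the freeze itself, sketched on the same stand-in model since the screenshot is lost, is just setting `requires_grad = False` on every parameter of the `vgg` child:)

```python
import torch.nn as nn

# Stand-in model reusing ssd.pytorch's attribute names (vgg, extras).
net = nn.Module()
net.vgg = nn.Sequential(nn.Conv2d(3, 64, 3))
net.extras = nn.Sequential(nn.Conv2d(64, 128, 1))

# Freeze only the vgg block; extras stays trainable.
for param in net.vgg.parameters():
    param.requires_grad = False

print([p.requires_grad for p in net.parameters()])  # [False, False, True, True]
```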

When the optimizer updates the weights, it assumes by default that every parameter it is given has requires_grad=True. So on using this: [screenshot, 2018-02-22 2:32 pm]

we get an error: [screenshot, 2018-02-22 2:39 pm]

Just change `net.parameters()` to `filter(lambda p: p.requires_grad, net.parameters())`:
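(Editor's note: a self-contained version of the fix, using a toy model rather than the SSD net. Recent PyTorch releases no longer raise this ValueError and simply skip frozen parameters, so the filter mattered mainly on older versions like the one in this thread:)

```python
import torch.nn as nn
import torch.optim as optim

net = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# Freeze the first Linear layer.
for param in net[0].parameters():
    param.requires_grad = False

# Hand the optimizer only trainable parameters; older PyTorch raised
# ValueError when given a parameter with requires_grad=False.
optimizer = optim.SGD(
    filter(lambda p: p.requires_grad, net.parameters()),
    lr=1e-3, momentum=0.9,
)
print(len(optimizer.param_groups[0]["params"]))  # 2: weight and bias of net[1]
```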

[screenshot, 2018-02-22 2:33 pm]