PPPW / deep-learning-random-explore


How can we cut the last few layers of NASNet? #11

Closed Ume0128 closed 5 years ago

Ume0128 commented 5 years ago

Hi,

I appreciate this GitHub repo. When we want to cut the last few layers of NASNet, you recommend changing them into an identity function. I tried to work this out myself but couldn't. Could you tell me how to do it?

Thanks in advance for your help.

PPPW commented 5 years ago

Hi @Ume0128, if you're talking about "cnn_archs.ipynb", that part is in:

import pretrainedmodels  # Cadene's pretrained-models.pytorch package
from torch import nn

def identity(x): return x  # pass-through used to neutralize the head

def nasnetamobile(pretrained=False):
    pretrained = 'imagenet' if pretrained else None
    model = pretrainedmodels.nasnetamobile(pretrained=pretrained, num_classes=1000)
    model.logits = identity  # skip the classifier so features pass through unchanged
    return nn.Sequential(model)

Was this not working for you?

Note that this is just a workaround for Cadene's implementation. In "cnn_archs_more.ipynb" there's an alternative way to handle NASNet, and you don't need to set the last few layers to identity.
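
As a rough sketch of the idea (not copied from the notebook, so treat the details as illustrative): Cadene's models split their forward pass into features() and logits(), so instead of patching the head you can call features() directly and attach whatever layers you want on top.

import torch
from torch import nn
import pretrainedmodels

class NASNetBody(nn.Module):
    # Thin wrapper exposing only the convolutional trunk of Cadene's NASNet.
    def __init__(self, pretrained='imagenet'):
        super().__init__()
        self.model = pretrainedmodels.nasnetamobile(pretrained=pretrained, num_classes=1000)

    def forward(self, x):
        # Cadene's forward is logits(features(x)); calling features() alone
        # skips the relu/pooling/dropout/linear head entirely.
        return self.model.features(x)

body = NASNetBody()
body.eval()
with torch.no_grad():
    feats = body(torch.randn(1, 3, 224, 224))  # NASNet-A Mobile expects 224x224 inputs
print(feats.shape)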

Ume0128 commented 5 years ago

Thank you for your reply.

I'm sorry that my question was confusing.

Actually, I want to create a new model composed of the first several layers of NASNet (e.g. up to (cell_11)) and my own custom layers, so I need to extract the output of a middle layer of NASNet. But the code below didn't work, because model.children() could not be converted to a list:

    nn.Sequential(*list(model.children())[:cut])

Do you know how to do that?

Thank you in advance.

PPPW commented 5 years ago

I see. I would recommend using the models provided by pytorchcv. I have some examples in this notebook; at the bottom I've added an example for you that cuts off part of the NASNet.
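
Roughly, the idea looks like the following (a minimal sketch, not the exact code from the notebook; the model name "nasnet_4a1056" and the cut-point child name are assumptions you should check against what print(model.features) shows). Instead of rebuilding the backbone with nn.Sequential, you can register a forward hook on an intermediate child of model.features and capture its output, which works even though NASNet's cells don't form a simple sequential chain.

import torch
from pytorchcv.model_provider import get_model as ptcv_get_model

# NASNet-A Mobile from pytorchcv (model name assumed from its model zoo).
model = ptcv_get_model("nasnet_4a1056", pretrained=True)
model.eval()

# Inspect the backbone to decide where to cut.
print([name for name, _ in model.features.named_children()])

captured = {}
def save_output(module, inputs, output):
    # Depending on where you hook, the output may be a tensor or an
    # (x, x_prev) tuple from the dual-path cells; keep it as returned.
    captured['feat'] = output

cut_name = 'stage2'  # placeholder: replace with the child you want to cut at
handle = dict(model.features.named_children())[cut_name].register_forward_hook(save_output)

with torch.no_grad():
    model(torch.randn(1, 3, 224, 224))  # dummy batch just to trigger the hook
handle.remove()

feats = captured['feat']
feats = feats[0] if isinstance(feats, tuple) else feats
# feats can now be fed into your custom layers, e.g. a pooling + linear head.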

Ume0128 commented 5 years ago

Hi @PPPW, thank you for your reply. This is exactly what I wanted to know! I'm grateful for your courteous response.