Closed InderpreetSinghChhabra01 closed 3 years ago
Hi @InderpreetSinghChhabra01,
Can you elaborate on what you were doing?
It is really hard to answer your question with the little information you provided.
@tatsy
I have trained the model. Now I want to use the classification model of cvaegan as a separate classifier that can tell me which class a given input image belongs to, so I did the following:
inputs = Input(shape=(64,64,3))
x = BasicConvLayer(filters=128, strides=(2, 2))(inputs)
x = BasicConvLayer(filters=256, strides=(2, 2))(x)
x = BasicConvLayer(filters=256, strides=(2, 2))(x)
x = BasicConvLayer(filters=512, strides=(2, 2))(x)
f = Flatten()(x)
x = Dense(1024)(f)
x = Activation('relu')(x)
x = Dense(10)(x)
predictions = Activation('softmax')(x)
model = Model(inputs=inputs, outputs=predictions)
model.summary()
model.load_weights('cls_trainer.hdf5')  # error occurred on this line
y = model(image)
print(y)
and I got the following error:
ValueError: You are trying to load a weight file containing 1 layers into a model with 10 layers.
I then obtained the structure of the weight file, which is shown in the attached file. Can you explain the structure, or how I can accomplish my task?
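For reference, a quick way to see what a Keras HDF5 weight file actually contains is to walk it with h5py (a minimal sketch; the group names it prints depend entirely on your file):

```python
# Minimal sketch: dump the group/dataset layout of a Keras HDF5 weight file.
import h5py

def print_h5_structure(path):
    """Recursively print every group and dataset in an HDF5 file."""
    def visitor(name, obj):
        if isinstance(obj, h5py.Dataset):
            print(f"{name}  shape={obj.shape}")
        else:
            print(name)
    with h5py.File(path, 'r') as f:
        # Keras stores the top-level layer names in the 'layer_names' attribute.
        print(list(f.attrs.get('layer_names', [])))
        f.visititems(visitor)

# print_h5_structure('cls_trainer.hdf5')
```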
Hi @InderpreetSinghChhabra01,
I guess your problem arises because you are trying to load the entire network's weights into a partial network. To fix the problem, first construct the entire CVAE-GAN network, then load the weight parameters into the constructed network, and finally use the classifier component for your purpose.
Hope it helps!
Actually, the weights are divided into four separate hdf5 files for the four models, i.e. encoder, decoder, classifier, and discriminator, and I am using only the classifier's weight file with its model. I don't think I need to create the entire model.
So, you mean you modified the program uploaded to this repository? How did you save the partial parameter files?
The program itself outputs four different parameter files for the four models in cvaegan inside the folder keras-generative/output/cvaegan/weights/epoch_00200/:
cls_trainer.hdf5
dec_trainer.hdf5
dis_trainer.hdf5
enc_trainer.hdf5
I am using cls_trainer.hdf5 for my task.
Hi @InderpreetSinghChhabra01,
Sorry for my misunderstanding. Yes, the program saves four different files for the subsidiary components, and I understand your problem.
I guess your problem occurs because you are loading the trainer's parameter file into a network that has no training parameters. The file xxx_trainer.hdf5
includes not only the network parameters but also the training parameters. The training parameters are saved so that training can be suspended and resumed.
The error message,
ValueError: You are trying to load a weight file containing 1 layers into a model with 10 layers.
means that the file contains the parameters of a 1-layer trainer (which internally holds the network parameters), while the target network contains 10 layers (of the classifier only).
Therefore, your problem should be solved by also defining a dummy trainer and loading the parameters into this dummy trainer.
The code will be like:
inputs = Input(shape=(64,64,3))
c_real = Input(shape=(10,))
x = BasicConvLayer(filters=128, strides=(2, 2))(inputs)
x = BasicConvLayer(filters=256, strides=(2, 2))(x)
x = BasicConvLayer(filters=256, strides=(2, 2))(x)
x = BasicConvLayer(filters=512, strides=(2, 2))(x)
f = Flatten()(x)
x = Dense(1024)(f)
x = Activation('relu')(x)
x = Dense(10)(x)
c_pred = Activation('softmax')(x)
# You can define model here
model = Model(inputs=inputs, outputs=c_pred)
model.summary()
# You should also define the loss and the trainer
c_loss = ClassifierLossLayer()([c_real, c_pred])
cls_trainer = Model(inputs=[inputs, c_real], outputs=[c_loss])
# Then, load parameters to the trainer!
cls_trainer.load_weights('cls_trainer.hdf5')
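A side note on why this approach works: `model` and `cls_trainer` wrap the *same* layer objects, so weights loaded through the trainer are immediately visible to the classifier. In plain Python terms (a toy illustration of shared objects, no Keras involved):

```python
# Toy illustration: two wrappers sharing one layer object see the same
# weights, which is why loading into the trainer also populates the model.
class Layer:
    def __init__(self):
        self.weights = None

class Wrapper:
    def __init__(self, layers):
        self.layers = layers
    def load_weights(self, values):
        for layer, w in zip(self.layers, values):
            layer.weights = w

shared = [Layer(), Layer()]
model = Wrapper(shared)        # stands in for the classifier
cls_trainer = Wrapper(shared)  # stands in for the trainer wrapping the same layers

cls_trainer.load_weights([[1.0], [2.0]])
print(model.layers[0].weights)  # → [1.0]
```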
Could you first check whether the above code works properly, please?
Hi @tatsy ,
I tried the above code and got the following error:
RuntimeError: Graph disconnected: cannot obtain value for tensor Tensor("input_3:0", shape=(?, 64, 64, 3), dtype=float32) at layer "input_3". The following previous layers were accessed without issue: []
Hi @InderpreetSinghChhabra01,
Oh, input arguments for ClassifierLossLayer were misplaced! Correctly, it should be:
c_loss = ClassifierLossLayer()([c_real, c_pred])
(The true class labels are provided for the 1st argument)
Hi @tatsy,
After making those changes I encountered the same error.
ValueError: You are trying to load a weight file containing 1 layers into a model with 10 layers.
Since our cls_trainer.hdf5 file contains ['input_7', 'input_8', 'model_4', 'classifier_loss_layer_1'], I think that if we can extract only the weights of 'model_4' from the hdf5 file, which are indeed the network weights, then our problem can be solved.
Hi @InderpreetSinghChhabra01,
Umm..., that's rather strange to me. So, could you check whether you can resume the training using your current pre-trained data? In that case all the parameter files, not only cls_trainer.hdf5, will be loaded, but that is the point of this check.
If you succeed in that, I think you can load only the parameters in cls_trainer.hdf5. This program saves the parameters in:
https://github.com/tatsy/keras-generative/blob/master/models/base.py#L160
Here, the networks registered with store_to_save are saved. When you load the parameters, they are simply loaded in:
https://github.com/tatsy/keras-generative/blob/master/models/base.py#L172
Again, the networks registered with store_to_save are loaded.
The code I provided above just duplicates this procedure for cls_trainer only. So, I guess you can load the cls_trainer's parameters with the above code once you confirm that you can resume the training.
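The store_to_save mechanism referred to above can be paraphrased roughly as follows (a sketch of the pattern only, not the repository's exact code; the save_model/load_model names and folder format are my placeholders modeled on the output layout mentioned earlier):

```python
# Sketch of the store_to_save save/load pattern: each registered component
# (e.g. cls_trainer) gets its own <name>.hdf5 file per epoch folder.
import os

class BaseModelSketch:
    def __init__(self):
        self.trainers = {}

    def store_to_save(self, name):
        # Register an attribute (a Keras model) under `name` for saving/loading.
        self.trainers[name] = getattr(self, name)

    def save_model(self, out_dir, epoch):
        folder = os.path.join(out_dir, 'epoch_%05d' % epoch)
        os.makedirs(folder, exist_ok=True)
        for name, model in self.trainers.items():
            model.save_weights(os.path.join(folder, '%s.hdf5' % name))

    def load_model(self, folder):
        for name, model in self.trainers.items():
            model.load_weights(os.path.join(folder, '%s.hdf5' % name))
```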
Hi, I have trained cvaegan on my dataset, but I want to test it as an image translation model on a single image, and I also want to use the classifier function separately for one input image. I tried doing this, but I got the following error for the classification model.
ValueError: You are trying to load a weight file containing 1 layers into a model with 10 layers.