You have to define your own data generator and make its input X contain two arrays (the images and the labels fed to the center-loss branch).
hist = model.fit_generator(
    generator=data_generator_centerloss(
        X=[x_train, y_train_a_class_value],
        Y=[y_train_a_class, y_train_a, random_y_train_a],
        batch_size=batch_size),
    steps_per_epoch=train_num // batch_size,
    validation_data=([x_test, y_test_a_class_value], [y_test_a_class, y_test_a, random_y_test_a]),
    epochs=nb_epochs, verbose=1,
    callbacks=callbacks)
Can you have a look at my .py script? I'm not good at defining my own Keras functions.
Here is an example generator for fit_generator. It should fit the center-loss data structure.
import numpy as np

def data_generator_centerloss(X, Y, batch_size):
    X1 = X[0]  # images
    X2 = X[1]  # labels fed to the center-loss branch
    Y1 = Y[0]
    Y2 = Y[1]
    Y3 = Y[2]
    while True:
        # Shuffle all inputs and targets with the same permutation each epoch
        idxs = np.random.permutation(len(X1))
        X1 = X1[idxs]  # images
        X2 = X2[idxs]  # labels for center loss
        Y1 = Y1[idxs]
        Y2 = Y2[idxs]
        Y3 = Y3[idxs]
        p1, p2, q1, q2, q3 = [], [], [], [], []
        for i in range(len(X1)):
            p1.append(X1[i])
            p2.append(X2[i])
            q1.append(Y1[i])
            q2.append(Y2[i])
            q3.append(Y3[i])
            if len(p1) == batch_size:
                yield [np.array(p1), np.array(p2)], [np.array(q1), np.array(q2), np.array(q3)]
                p1, p2, q1, q2, q3 = [], [], [], [], []
        # Yield the final partial batch, if any
        if p1:
            yield [np.array(p1), np.array(p2)], [np.array(q1), np.array(q2), np.array(q3)]
            p1, p2, q1, q2, q3 = [], [], [], [], []
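A quick way to sanity-check the generator before training (variable names follow the call above; the shapes in the comments are only illustrative):

gen = data_generator_centerloss(
    X=[x_train, y_train_a_class_value],
    Y=[y_train_a_class, y_train_a, random_y_train_a],
    batch_size=batch_size)
xs, ys = next(gen)
print([a.shape for a in xs])  # e.g. [(batch_size, H, W, C), (batch_size, 1)]
print([a.shape for a in ys])  # one entry per model output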
train_generator = generator_from_directory(data_train_agu(), train_path, SIZE, BATCH_SIZE, 'categorical')
val_generator = generator_from_directory(data_val_agu(), val_path, SIZE, BATCH_SIZE, 'categorical')
if isCenterloss:
    random_y_train = np.random.rand(123520, 1)  # x_train
    random_y_test = np.random.rand(7119, 1)     # x_test

history2 = model_centerloss.fit_generator(train_generator, [y_train, random_y_train],
                                          batch_size=batch_size, epochs=epochs,
                                          verbose=1, validation_data=(val_generator, [y_test, random_y_test]),
                                          callbacks=[lr, modelcheckpoint])
Yes. You have to redefine them yourself. The data generator example I gave above should be enough.
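If it helps, a rough sketch of wrapping an existing directory generator so it yields the two-input / two-target structure that model_centerloss expects. It assumes the base generator yields (images, one_hot_labels) batches and that the center-loss branch takes the integer class index as its second input; adjust to match your own model.

import numpy as np

def wrap_for_centerloss(base_generator):
    # Convert (images, one_hot) batches into ([images, class_idx], [one_hot, dummy])
    while True:
        x_batch, y_batch = next(base_generator)                # images, one-hot labels
        class_idx = np.argmax(y_batch, axis=1).reshape(-1, 1)  # labels for the center-loss branch
        dummy = np.zeros((len(x_batch), 1))                    # placeholder target for the center-loss output
        yield [x_batch, class_idx], [y_batch, dummy]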
OK, I'll do it. Can I have your WeChat? Maybe another bug is the cause.
Is this right? It still gives an error in Keras.
@FlyEgle Did you get this to work?
Yes, I got it working with my own generator.
Same here. Works wonders!
@FlyEgle Will you be kind enough to share your code? Thanks!
Hi @FlyEgle, does your code work well with center_loss? It doesn't seem to work in my code; the accuracy is low and I don't know how to fix it. Could you check #3 and give me some suggestions? Thanks.
fit
history_softmax = resnet_model.fit_generator(
    train_generator,
    steps_per_epoch=123520 // BATCH_SIZE,
    epochs=EPOCHS,
    validation_data=val_generator,
    validation_steps=7119 // BATCH_SIZE,
    callbacks=[lr])
fit_generator
history2 = model_centerloss.fit_generator(train_generator, [y_train, random_y_train],
                                          batch_size=batch_size, epochs=epochs,
                                          verbose=1, validation_data=(val_generator, [y_test, random_y_test]),
                                          callbacks=[lr, modelcheckpoint])
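For reference, fit_generator does not accept batch_size or a separate target list; the generator itself has to yield the targets. Using a wrapper like the wrap_for_centerloss sketch above (the wrapper name is my own; the sample counts come from the snippets in this thread), the center-loss run could look roughly like:

history2 = model_centerloss.fit_generator(
    wrap_for_centerloss(train_generator),
    steps_per_epoch=123520 // BATCH_SIZE,
    epochs=EPOCHS,
    verbose=1,
    validation_data=wrap_for_centerloss(val_generator),
    validation_steps=7119 // BATCH_SIZE,
    callbacks=[lr, modelcheckpoint])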