Hello, I tried to simplify your code for the NER task. I made a model as below:

```
model = XXXX(config)
model.compile(optimizer='adam')
```

Then train the model by:

```
earlystop = EarlyStopping(monitor='val_loss', min_delta=0, patience=10, verbose=1)
checkpoint = ModelCheckpoint(
    os.path.join(config.dir_output, 'best-weights.h5'),
    monitor='val_loss',
    verbose=1,
    save_best_only=True,
    save_weights_only=True
)
model.train_model.fit_generator(train_generator, steps_per_epoch=steps_per_epoch,
                                validation_data=dev_generator,
                                validation_steps=dev_steps, verbose=1,
                                callbacks=[earlystop, checkpoint],
                                shuffle=False, epochs=100)
```

In addition, I modified the function `load_google_bert` and commented out the line
`weights[w_id][vocab_size + TextEncoder.EOS_OFFSET] = saved[3 + TextEncoder.BERT_UNUSED_COUNT]`
because the variable `TextEncoder.BERT_SPECIAL_COUNT` is 4 instead of 5,
so the created model does not have that many weights.

Hi @ChiuHsin,
Sorry for the super late reply. Your code seems alright, so I think something is wrong in my code instead. I will look into it as soon as I have some free time.
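For reference, the two callbacks in the training snippet above work together: `EarlyStopping(min_delta=0, patience=10)` stops training after 10 consecutive epochs without a `val_loss` improvement, and `ModelCheckpoint(save_best_only=True)` overwrites the saved weights only on epochs where `val_loss` improves. A minimal pure-Python sketch of that bookkeeping (the `BestLossTracker` class is hypothetical, for illustration only, not part of Keras):

```
class BestLossTracker:
    """Tracks val_loss across epochs: reports when the current weights are
    the best so far (checkpoint them) and when patience is exhausted (stop)."""

    def __init__(self, min_delta=0.0, patience=10):
        self.min_delta = min_delta
        self.patience = patience
        self.best = float('inf')
        self.wait = 0  # epochs since the last improvement

    def update(self, val_loss):
        """Returns (improved, should_stop) for this epoch."""
        if val_loss < self.best - self.min_delta:
            self.best = val_loss
            self.wait = 0
            return True, False   # improved: save checkpoint, keep training
        self.wait += 1
        return False, self.wait >= self.patience


# Example: with patience=3, training stops after 3 flat epochs,
# and only the first two epochs would trigger a checkpoint save.
tracker = BestLossTracker(min_delta=0.0, patience=3)
results = [tracker.update(loss) for loss in [1.0, 0.8, 0.8, 0.8, 0.8]]
```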