google-research / bert

TensorFlow code and pre-trained models for BERT
https://arxiv.org/abs/1810.04805
Apache License 2.0

Can't retrain with different models in colab BERT FineTuning with Cloud TPU: Sentence and Sentence-Pair Classification Tasks #402

Open xaviergonzalez opened 5 years ago

xaviergonzalez commented 5 years ago

Thank you so much for this great tool!

I was trying to play around with your colab notebook titled

"BERT finetuning tasks in 5 minutes with Cloud TPU"

I wanted to switch from Bert Base to Bert Large, and so made the change

BERT_MODEL = 'uncased_L-24_H-1024_A-16' #@param {type:"string"}

However, when I tried to retrain the model, it seemed that no retraining actually occurred: it finished instantly and produced the message:

"INFO:tensorflow:Skipping training since max_steps has already saved. INFO:tensorflow:training_loop marked as finished"

What am I doing wrong/how can I fix this?

berfubuyukoz commented 5 years ago

Hi @xaviergonzalez, have you been able to solve it? I am facing the exact same issue.

berfubuyukoz commented 5 years ago

Hi again, deleting the checkpoint file stored under the model directory in the bucket solved the issue for me, as pointed out in this post.
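For anyone hitting this later: the skip happens because the estimator finds an existing checkpoint in OUTPUT_DIR that has already reached max_steps, so switching BERT_MODEL without changing the output directory makes it believe training is done. A minimal sketch of one way to avoid stale checkpoints is to key the output directory to the model name (the bucket name here is hypothetical; OUTPUT_DIR mirrors the notebook's variable of the same name):

```python
# Sketch: give each BERT variant its own output directory so a
# BERT-Large run never sees leftover BERT-Base checkpoints.
BUCKET = "my-tpu-bucket"  # hypothetical bucket name, replace with yours
BERT_MODEL = "uncased_L-24_H-1024_A-16"  # @param {type:"string"}

# Per-model output dir: changing BERT_MODEL changes the checkpoint location,
# so the estimator starts training from step 0 instead of skipping.
OUTPUT_DIR = "gs://{}/bert/models/{}".format(BUCKET, BERT_MODEL)
print(OUTPUT_DIR)
```

Alternatively, you can delete the old checkpoints in place (e.g. `gsutil rm -r` on the old output directory) before rerunning training, which is effectively what the fix above does.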