Closed: lytum closed this issue 4 years ago
Hi, recently I have also been focusing on research into jointly training slot filling and intent classification with BERT. May I ask why you save the trained model as 'xx.h5'? What does that format mean, and how does it differ from a normal checkpoint?
Thanks in advance! Ye
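For context, the difference between the two save paths can be sketched with a minimal Keras example. The toy model below stands in for the joint BERT model; all names and paths here are illustrative assumptions, not the repo's actual code. A `.h5` file is the Keras HDF5 format: a single file holding the architecture, weights, and optimizer state, so it can be reloaded without the model-building code. A "normal" TensorFlow checkpoint stores variable values split across index/data files and requires the model to be rebuilt in code before restoring.

```python
import os
import tempfile

import tensorflow as tf

# Toy model standing in for the joint slot-filling/intent model (illustrative).
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4, activation="relu"),
    tf.keras.layers.Dense(2),
])
model.compile(optimizer="adam", loss="mse")

workdir = tempfile.mkdtemp()

# 1) Keras HDF5 format: one .h5 file with architecture + weights + optimizer state.
h5_path = os.path.join(workdir, "model.h5")
model.save(h5_path)

# 2) TensorFlow checkpoint format: variable values only, written as
#    <prefix>-N.index plus <prefix>-N.data-* shards.
ckpt = tf.train.Checkpoint(model=model)
ckpt_path = ckpt.save(os.path.join(workdir, "ckpt"))

# The .h5 file reloads as a full model object; the checkpoint would need
# the architecture reconstructed in code before calling ckpt.restore().
restored = tf.keras.models.load_model(h5_path)
```

In short, `.h5` is convenient for shipping a self-contained trained model, while checkpoints are the natural fit for resuming training inside the same codebase.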