Open zheyuye opened 4 years ago
It seems there is a conflict: `num_attention_heads` is set to 32 in the `albert_config.json` included in the model tarball downloaded from TF Hub, while the same page documents it as 16.
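For reference, the discrepancy can be confirmed by reading the config file shipped in the tarball. A minimal sketch, assuming the file has been extracted locally (the JSON contents below are illustrative, showing only the relevant field):

```python
import json

# Illustrative excerpt of albert_config.json as shipped in the TF Hub tarball;
# the real file contains additional keys (hidden_size, num_hidden_layers, etc.).
config_text = '{"num_attention_heads": 32}'
config = json.loads(config_text)

# The TF Hub page documents 16 heads, but the bundled config says 32.
print(config["num_attention_heads"])  # -> 32
```

In practice you would replace `config_text` with `open("albert_config.json").read()` after extracting the tar file.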