Hey @hengruo! I have a question for you.
char_dim = 64  # Embedding dimension for char
In the QANet paper, the authors state: "Each character is represented as a trainable vector of dimension p2 = 200, meaning each word can be viewed as the concatenation of the embedding vectors for each of its characters."
But in config.py I found char_dim = 64. Also, when I ran this repo, both the F1 and EM scores were very low (F1 only close to 10).
What's more:
d_model = 96  # Dimension of connectors of each layer
Shouldn't this be 128?
Do these settings affect the model's performance?
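For reference, if I read the paper correctly, the corresponding values in config.py would look like this (variable names taken from this repo; the values are my reading of the paper, so please correct me if I'm wrong):

```python
# Hypothetical config.py values matching the QANet paper
char_dim = 200  # paper: p2 = 200, per-character embedding dimension
d_model = 128   # paper: hidden size d = 128 throughout the model layers
```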
Thanks a lot; I'd appreciate it if you could take a look when you have time.