hengruo / QANet-pytorch

A PyTorch implementation of QANet.
MIT License

why the dim of character vec set to 64? #18

Open JewelChen2019 opened 4 years ago

JewelChen2019 commented 4 years ago

hey, hengruo! I've got a question to consult you~

In config.py, the character embedding dimension is set to char_dim = 64. But in the QANet paper, the authors state: "Each character is represented as a trainable vector of dimension p2 = 200, meaning each word can be viewed as the concatenation of the embedding vectors for each of its characters." Why 64 instead of 200?

Also, when I ran this repo, both the F1 score and EM were very low (F1 only close to 10).

One more thing: config.py sets d_model = 96 (the dimension of the connectors of each layer) — should this be 128?

Do these settings affect the model's performance? Thanks a lot, I'd appreciate it if you have time.
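To make the role of char_dim concrete, here is a minimal NumPy sketch (not the repo's actual code; the vocabulary size, word length, and character ids below are made up for illustration) of how a character embedding table maps a word's characters to one fixed-size vector. The shapes work out the same whether char_dim is 64, as in config.py, or 200, as in the paper's p2 — it is a hyperparameter, so changing it trades capacity against speed rather than breaking the model.

```python
import numpy as np

rng = np.random.default_rng(0)

char_vocab_size = 100   # hypothetical character vocabulary size
char_dim = 64           # config.py value; the paper uses p2 = 200
word_len = 16           # QANet pads/truncates each word to 16 characters

# Trainable lookup table: one row per character.
char_emb = rng.normal(size=(char_vocab_size, char_dim))

# A word as a sequence of (made-up) character ids, zero-padded to word_len.
word_char_ids = np.array([5, 12, 9, 3] + [0] * 12)

# Look up each character's vector: shape (word_len, char_dim).
vecs = char_emb[word_char_ids]

# QANet then reduces over the character axis (a max after a convolution
# in the paper), so every word maps to one vector of size char_dim.
word_char_repr = vecs.max(axis=0)

assert vecs.shape == (word_len, char_dim)
assert word_char_repr.shape == (char_dim,)
```

Note the per-word output size depends only on char_dim, not on the word's length, which is why the rest of the network still fits together after lowering the dimension from 200 to 64.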

xfy998 commented 3 years ago

hello, JewelChen! I ran into the same problem as you. May I ask if you have solved it yet?