Hi, I'm using multilingual BERT (pre-trained weights downloaded from Google's official GitHub) as my PTM, following the setup in Table 5 of your paper. Your XLT result for EN-ZH is F1 = 57.5 / EM = 37.3, but my F1 is only about 20%. That's a large gap, and I don't think hyperparameters alone can explain it. Could you please release your source code in this repository? Thanks!