kaistAI / LangBridge

[ACL 2024] LangBridge: Multilingual Reasoning Without Multilingual Supervision
https://aclanthology.org/2024.acl-long.405/

Encoder and LM weights might not be loaded from pretrained checkpoints #3

Closed · rahular closed this issue 7 months ago

rahular commented 8 months ago

Hi @MattYoon, I was going through the code and could not find where the pretrained weights of the encoder and LM are loaded before the alignment starts. I found that the models are initialized from their configs (for example, here and here), which means their weights are randomly initialized.

Is this correct, or am I missing something basic?
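For reference, the behavior being asked about can be sketched with a toy PyTorch module (this is illustrative only, not the repo's actual code): constructing a model from its architecture/config alone draws fresh random weights every time, so two such constructions will not agree.

```python
import torch
import torch.nn as nn

# Toy stand-in for building a model from its config alone
# (as opposed to loading a pretrained checkpoint): each
# construction samples fresh random parameters.
model_a = nn.Linear(8, 8)
model_b = nn.Linear(8, 8)

# Two config-only initializations do not share weights.
weights_differ = not torch.equal(model_a.weight, model_b.weight)
print(weights_differ)
```

The same holds for Hugging Face models: `AutoModel.from_config(config)` initializes randomly, whereas `AutoModel.from_pretrained(...)` loads checkpoint weights.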

MattYoon commented 8 months ago

Thanks again for reporting the issue.

You are correct: the weights should be loaded from pretrained models in train_langbridge.py, but in the current code that wasn't happening.

I fixed it in the latest commit. I had accidentally removed that part while refactoring the code. The original experiments were run with the correct pretrained weights.
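The effect of the fix can be sketched with a minimal PyTorch example (hypothetical names, not the repo's code): a config-only model starts from random weights, and only becomes equivalent to the pretrained model once the checkpoint is actually loaded.

```python
import io
import torch
import torch.nn as nn

# Hypothetical minimal "pretrained" model standing in for the encoder/LM.
pretrained = nn.Linear(8, 8)

# Save its checkpoint to an in-memory buffer (stand-in for a saved file).
buf = io.BytesIO()
torch.save(pretrained.state_dict(), buf)

# A config-only initialization starts from random weights...
model = nn.Linear(8, 8)
randomly_initialized = not torch.equal(model.weight, pretrained.weight)

# ...until the checkpoint is loaded, which is what the fix restores.
buf.seek(0)
model.load_state_dict(torch.load(buf))
weights_restored = torch.equal(model.weight, pretrained.weight)
print(randomly_initialized, weights_restored)
```

In the Hugging Face setting this corresponds to loading via `from_pretrained` (or an explicit `load_state_dict`) rather than constructing the model from its config alone.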

Please let me know if there are other issues!