yxli2123 / LoftQ

MIT License

Does it support Mixtral 8x7B? #15

Open iMountTai opened 5 months ago

iMountTai commented 5 months ago

After I modified the code, there was a problem with the size of the LoRA weights for the gate layer. After loading, I found that lora_A had the same shape as base_layer, which caused a size-mismatch error. Thanks!
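To make the reported mismatch concrete, here is a minimal sketch of the expected LoRA factor shapes for a linear layer such as the Mixtral gate (router). The dimensions and variable names below are illustrative assumptions, not taken from the repository code.

```python
import numpy as np

# Illustrative dimensions for a Mixtral-style gate layer
# (hidden_size -> num_experts); values are assumptions.
hidden_size, num_experts, r = 4096, 8, 16

# Base weight W has shape (out_features, in_features).
base_layer = np.zeros((num_experts, hidden_size))

# Correct low-rank factors: A projects down, B projects up,
# so that B @ A matches the base weight's shape.
lora_A = np.zeros((r, hidden_size))    # (r, in_features)
lora_B = np.zeros((num_experts, r))    # (out_features, r)
assert (lora_B @ lora_A).shape == base_layer.shape

# The reported bug: lora_A ends up with the base layer's shape
# instead of (r, in_features), so loading fails with a size mismatch.
buggy_lora_A = np.zeros_like(base_layer)
assert buggy_lora_A.shape != lora_A.shape
```

If lora_A is saved or initialized with the full weight's shape, any checkpoint load that expects the (r, in_features) factor will raise a size-mismatch error like the one described above.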

yxli2123 commented 4 months ago

Hi @iMountTai, have you resolved this issue? Could you please provide the code you modified?