PiotrNawrot / nanoT5

Fast & Simple repository for pre-training and fine-tuning T5-style models
Apache License 2.0

fine-tuning error: No module named adaptive.moe #5

Closed fancyisbest closed 1 year ago

fancyisbest commented 1 year ago

Hi Piotr,

Your work on the nanoT5 repository is amazing; we like it and want to reproduce it.

I have been studying NLP recently, so I have been trying to learn how to run T5 the way you do. Pre-training succeeded, but fine-tuning failed: I couldn't execute the fine-tuning Python command because the module `adaptive.moe` doesn't exist anywhere. I also googled it and couldn't find the module, so could you help me figure it out?
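For context, this message is Python's standard response when `-m` (or an import) names a module that cannot be found on the path. A minimal sketch reproducing it, assuming no package named `adaptive` is installed in the environment:

```python
import importlib

# Importing a module that does not exist anywhere on sys.path raises
# ModuleNotFoundError -- the same failure behind `python -m adaptive.moe`.
try:
    importlib.import_module("adaptive.moe")
except ModuleNotFoundError as exc:
    print(exc)  # e.g. "No module named 'adaptive'"
```

So the error points to the command itself being wrong, not to a missing dependency that could be installed.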

Thanks a lot for your help!

fan

PiotrNawrot commented 1 year ago

I'm sorry, it's mistyped in the README.

Please use `python -m nanoT5.main ...` instead of `python -m adaptive.moe ...`

Thanks for this catch!

PiotrNawrot commented 1 year ago

Please let me know if it works now :)