Open MohamedAliRashad opened 1 year ago
How do I change the context length of MPT-7B-Instruct for finetuning? I keep getting an error saying the maximum length is only 2048 tokens.

I'm not a developer and I also want to know about this. I think we need to change line 18 of mpttune/model/mpt/config.py and re-install the package so the change takes effect (pip uninstall mpttune, then python setup.py install and python setup_cuda.py install). Unfortunately I don't have a dataset to try this with...
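For reference, here is a rough sketch of what that change might look like, assuming the mpttune config mirrors the Hugging Face MPTConfig and exposes a `max_seq_len` field (I haven't verified what line 18 actually contains, so check your local checkout before editing):

```python
# Hypothetical sketch of mpttune/model/mpt/config.py around line 18.
# Field names are an assumption based on the Hugging Face MPTConfig;
# verify against the real file before editing.
from dataclasses import dataclass

@dataclass
class MPTConfig:
    d_model: int = 4096
    n_heads: int = 32
    n_layers: int = 32
    # MPT uses ALiBi positional biases, so raising this limit is what
    # lifts the 2048-token cap seen during finetuning.
    max_seq_len: int = 2048  # e.g. change to 4096

# After editing, reinstall so the installed package picks up the change:
#   pip uninstall mpttune
#   python setup.py install
#   python setup_cuda.py install
```

Keep in mind that extending the context only changes the limit the code enforces; whether quality holds up at longer lengths depends on the model itself.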