rmihaylov / mpttune

Tune MPTs
Apache License 2.0

What does the structure of the dataset need to be? #1

Open MohamedAliRashad opened 1 year ago

MohamedAliRashad commented 1 year ago

I want to finetune MPT-7B-Instruct but I don't know how. Can mpttune help me with this?

rmihaylov commented 1 year ago

Yes, I just added it to the README.

eschaffn commented 1 year ago

Will it work with oasst1? What kind of processing needs to be done in order to make it work?

rmihaylov commented 1 year ago

You have to add it to data.py as a separate class Train...
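For example, here is a minimal sketch of what such a class could look like, assuming oasst1 is loaded from the Hugging Face hub with `datasets`; the class name `TrainOasst1`, its constructor, and the prompt format are illustrative only, not mpttune's actual interface — adapt it to the existing Train* classes in data.py:

```python
from datasets import load_dataset


class TrainOasst1:
    """Hypothetical oasst1 loader; adapt to the existing Train* classes in data.py."""

    def __init__(self, tokenizer, cutoff_len=512):
        self.tokenizer = tokenizer
        self.cutoff_len = cutoff_len

    def prepare_data(self):
        # oasst1 is stored as message trees; keep only English
        # (prompter -> assistant) pairs for instruction tuning.
        raw = load_dataset("OpenAssistant/oasst1", split="train")
        by_id = {row["message_id"]: row for row in raw}

        pairs = []
        for row in raw:
            if row["role"] == "assistant" and row["lang"] == "en":
                parent = by_id.get(row["parent_id"])
                if parent is not None and parent["role"] == "prompter":
                    pairs.append({"prompt": parent["text"], "response": row["text"]})

        return [self._tokenize(p) for p in pairs]

    def _tokenize(self, pair):
        # Alpaca-style prompt format; swap in whatever template the
        # rest of the training pipeline expects.
        text = (
            f"### Instruction:\n{pair['prompt']}\n\n"
            f"### Response:\n{pair['response']}"
        )
        tokens = self.tokenizer(
            text,
            truncation=True,
            max_length=self.cutoff_len,
            padding="max_length",
        )
        tokens["labels"] = list(tokens["input_ids"])
        return tokens
```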

MohamedAliRashad commented 1 year ago

@rmihaylov How can I extend the context length of MPT-7B-Instruct?
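For reference, MPT models use ALiBi positional embeddings, so the maximum sequence length can be raised at load time through the model config. A minimal sketch following the pattern documented in the MosaicML model card (the 4096 value is just an example; quality may degrade well beyond the trained length):

```python
# Raise MPT's context window at load time; ALiBi lets the model extrapolate
# beyond the 2048 tokens it was trained on.
import transformers

name = "mosaicml/mpt-7b-instruct"
config = transformers.AutoConfig.from_pretrained(name, trust_remote_code=True)
config.max_seq_len = 4096  # input + output tokens may now total up to 4096

model = transformers.AutoModelForCausalLM.from_pretrained(
    name, config=config, trust_remote_code=True
)
```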

benam2 commented 1 year ago

@rmihaylov Thanks for the great repo. I want to fine-tune mpt-7b-instruct on both Alpaca and OpenAssistant. Could you please share your thoughts on this? Do I need to mix the datasets and fine-tune on the combined set, or should I add another script that reads from OpenAssistant and fine-tune on it after the Alpaca fine-tuning finishes?

Thanks~
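One option (a sketch, not a maintainer recommendation) is to merge both sources into a single shuffled dataset and fine-tune once, rather than running two sequential passes. The dataset identifiers and column handling below are assumptions; map the result to whatever schema mpttune's data.py classes expect:

```python
# Hypothetical sketch: build one mixed Alpaca + oasst1 training set instead of
# fine-tuning on the two datasets sequentially.
from datasets import Dataset, load_dataset, concatenate_datasets

alpaca = load_dataset("tatsu-lab/alpaca", split="train")
oasst = load_dataset("OpenAssistant/oasst1", split="train")

# Alpaca already ships instruction/output columns (the optional "input"
# field is dropped here for brevity).
alpaca_pairs = alpaca.remove_columns(
    [c for c in alpaca.column_names if c not in ("instruction", "output")]
)

# oasst1 is a message tree: pair each English assistant reply with its prompt.
by_id = {row["message_id"]: row for row in oasst}
pairs = []
for row in oasst:
    if row["role"] == "assistant" and row["lang"] == "en":
        parent = by_id.get(row["parent_id"])
        if parent is not None and parent["role"] == "prompter":
            pairs.append({"instruction": parent["text"], "output": row["text"]})
oasst_pairs = Dataset.from_list(pairs)

# Concatenate and shuffle so both sources are interleaved during training.
mixed = concatenate_datasets([alpaca_pairs, oasst_pairs]).shuffle(seed=42)
mixed.save_to_disk("alpaca_plus_oasst1")
```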