which47 / LLMCL

Analyzing and Reducing Catastrophic Forgetting in Parameter Efficient Tuning

Refactor the project #4

Closed. 2proveit closed this issue 2 months ago.

2proveit commented 2 months ago

Simplified the whole process.

2proveit commented 2 months ago

Refactored the whole project: distributed training is now handled entirely through a DeepSpeed config, all settings are collected into a single .py file, and the training methods have been tested, though more testing is still needed. A minimal sketch of the single-file config idea is shown below.
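For reference, here is a minimal sketch of what "one DeepSpeed config in a single .py file" could look like. The dict keys are standard DeepSpeed options, but `DS_CONFIG` and the toy model are hypothetical stand-ins, not the repo's actual code; a real run would pass the project's LLM and be launched with the `deepspeed` launcher (e.g. `deepspeed train.py`).

```python
# Hypothetical config.py-style layout: DeepSpeed settings live in a plain
# Python dict instead of a separate JSON file, so one .py file carries the
# whole distributed-training configuration.
import torch
import deepspeed

# Standard DeepSpeed options; values here are illustrative defaults only.
DS_CONFIG = {
    "train_micro_batch_size_per_gpu": 4,
    "gradient_accumulation_steps": 8,
    "bf16": {"enabled": True},
    "zero_optimization": {"stage": 2},
    "optimizer": {"type": "AdamW", "params": {"lr": 1e-4}},
}

model = torch.nn.Linear(16, 2)  # stand-in for the actual model

# deepspeed.initialize accepts the config as a dict, so no JSON file or
# --deepspeed_config CLI flag is needed; it sets up distributed training,
# mixed precision, and ZeRO sharding from DS_CONFIG.
engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=DS_CONFIG,
)
```

Keeping the dict in Python rather than JSON also lets the training hyperparameters and the DeepSpeed settings be defined, validated, and imported from one place.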