microsoft / mttl

Building modular LMs with parameter-efficient fine-tuning.
MIT License

use gpt-neo-125m for testing #99

Closed — matheper closed this issue 2 months ago

matheper commented 2 months ago

Using gpt-neo-125m for testing avoids the tokenizer's dependency on sentencepiece.