pytorch-labs / gpt-fast

Simple and efficient pytorch-native transformer text generation in <1000 LOC of python.
BSD 3-Clause "New" or "Revised" License

Request for Smaller Model Options (~1B Parameters) #210

Open deafTim opened 1 month ago

deafTim commented 1 month ago

Currently, I can only run small models. Is there a supported model option around the 1B parameter size?
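
For context, one workaround I've been considering is registering a ~1B-parameter Llama-style model (e.g. TinyLlama-1.1B) myself. Below is a minimal sketch of what I mean, assuming `model.py` still exposes a `transformer_configs` dict keyed on checkpoint name and resolves it via `ModelArgs.from_name()`; the TinyLlama hyperparameters are taken from its model card and would need verifying. This is not an official gpt-fast config.

```python
# Hypothetical sketch: register an ~1B-parameter Llama-style model by adding
# an entry to the transformer_configs dict in gpt-fast's model.py.
# The key should match the Hugging Face directory name so that
# ModelArgs.from_name() can resolve it. Hyperparameters below come from the
# TinyLlama-1.1B model card and should be double-checked before use.
transformer_configs["TinyLlama-1.1B-Chat-v1.0"] = dict(
    block_size=2048,         # maximum sequence length
    vocab_size=32000,        # Llama tokenizer vocabulary size
    n_layer=22,              # number of transformer blocks
    n_head=32,               # attention heads
    n_local_heads=4,         # grouped-query attention KV heads
    dim=2048,                # hidden size
    intermediate_size=5632,  # feed-forward hidden size
)
```

With an entry like that, I'd expect the usual README workflow (scripts/download.py, scripts/convert_hf_checkpoint.py, then generate.py) to apply unchanged, but an officially supported model in this size range would be much nicer.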