jncraton / languagemodels

Explore large language models in 512MB of RAM
https://jncraton.github.io/languagemodels/
MIT License

Is it possible to manually load the trained weights? #25

Closed sandoche closed 1 year ago

sandoche commented 1 year ago

Right now, running lm.do() downloads the model weights and other files.

I am working on a tiny experiment and I would like to package the trained weights with it, so basically download them once and then load them manually.

Is this possible?

jncraton commented 1 year ago

This should be possible. Can you share a little more about what you are trying to do? This package currently uses the Hugging Face Hub to manage and cache downloaded files. Once the files are downloaded on the first run, there shouldn't be a need to download anything again.

If you bundle the populated cache directory in your package, you shouldn't need to download any additional files. There is more information available in the cache documentation.
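A minimal sketch of how that could look, assuming the package resolves files through huggingface_hub so the standard HF_HOME and HF_HUB_OFFLINE environment variables apply; the cache directory name and paths here are placeholders for wherever your app bundles the pre-populated cache (populated by running the library once on a machine with network access and copying the resulting cache directory):

```python
import os

# Point the Hugging Face cache at the directory bundled with the app
# ("hf_cache" is an assumed bundle location, not part of the library)
os.environ["HF_HOME"] = os.path.join(os.path.dirname(__file__), "hf_cache")

# Optionally refuse any network access so missing files fail fast
os.environ["HF_HUB_OFFLINE"] = "1"

# Import after the environment is configured so the cache location is picked up
import languagemodels as lm

print(lm.do("What color is the sky?"))
```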

sandoche commented 1 year ago

Nice, that should work by specifying the cache location. I'll try it later when I find some time. Thanks for your quick response.

I'm trying to build a tiny app packaged with all the models that can work offline 🙂

jncraton commented 1 year ago

Sounds good. I'd love to see what you come up with. Good luck and have fun! :smiley: