jzhang38 / TinyLlama

The TinyLlama project is an open endeavor to pretrain a 1.1B Llama model on 3 trillion tokens.
Apache License 2.0

Usage documentation #129

Closed: simensandhaug closed this issue 5 months ago

simensandhaug commented 6 months ago

I was trying to use this, but running script.sh didn't work after installing the requirements in a venv. I also tried running chat_gradio, but the gradio package wasn't even in requirements. It worked after I pip installed gradio separately, but I then got an error when trying to chat. This really needs some usage documentation.

hunter-lee1 commented 6 months ago

We have updated chat_gradio to address these issues. You can also try the new version in the Google Colab notebook tinyllama-gradio.
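
Until usage documentation lands, here is a minimal sketch of what a Gradio chat front end for TinyLlama can look like. It assumes the TinyLlama/TinyLlama-1.1B-Chat-v1.0 checkpoint on Hugging Face and the standard transformers/gradio APIs; it is an illustration, not the repository's actual chat_gradio script, and gradio must still be installed separately since it is not in requirements.

```python
# Hypothetical minimal Gradio chat for TinyLlama (not the repo's chat_gradio script).
# Assumes: pip install gradio transformers torch  (gradio is not in requirements, per the report above)
import gradio as gr
from transformers import pipeline

# Assumed chat checkpoint name on Hugging Face; swap in the model you are using.
pipe = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v1.0")

def respond(message, history):
    # Rebuild the conversation in the role/content format the chat template expects.
    messages = []
    for user_msg, bot_msg in history:
        messages.append({"role": "user", "content": user_msg})
        messages.append({"role": "assistant", "content": bot_msg})
    messages.append({"role": "user", "content": message})
    prompt = pipe.tokenizer.apply_chat_template(
        messages, tokenize=False, add_generation_prompt=True
    )
    out = pipe(prompt, max_new_tokens=256, do_sample=True, temperature=0.7)
    # The pipeline returns the prompt plus the completion; strip the prompt.
    return out[0]["generated_text"][len(prompt):].strip()

gr.ChatInterface(respond).launch()
```

Running this opens a local Gradio UI in the browser; the Colab notebook linked above is the easiest way to try the maintained version without setting up a local environment.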