randaller / llama-chat

Chat with Meta's LLaMA models at home made easy
GNU General Public License v3.0

Is example-chat.py ready to use GPU? #23

Open kaykyr opened 1 year ago

kaykyr commented 1 year ago

I have an RTX 4090 with 24GB of VRAM plus 64GB of system RAM. Is example-chat.py ready to work with this hardware? Thanks!
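
For reference, a quick way to confirm that PyTorch can even see the card before running example-chat.py is a minimal sketch like the one below (assuming the CUDA build of PyTorch is installed; the script itself may place the model differently):

```python
import torch

# Check whether the CUDA runtime and the GPU are visible to PyTorch.
if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"GPU: {props.name}, VRAM: {props.total_memory / 1024**3:.1f} GB")
else:
    # Without a CUDA-enabled PyTorch build, inference would run on CPU/RAM only.
    print("CUDA not available")
```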