lm-sys / FastChat

An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
Apache License 2.0

Why can't I get_conv_template for llama3? #3281

Open · AII6 opened 4 months ago

AII6 commented 4 months ago

[screenshot attachment]

Emperorizzis commented 3 months ago

Hi, this is because llama-3 support only exists on FastChat's main branch; it is not included in the latest release, 0.36.2.

If you want to use llama-3, install FastChat from the main branch.
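A minimal sketch of that fix, assuming pip and a standard Python environment (the `fschat` package name and the repo URL match PyPI/GitHub; the `llama-3` template name is what main-branch FastChat registers, but verify against your checkout):

```shell
# Remove the PyPI release, which lacks the llama-3 conversation template
pip uninstall -y fschat

# Install FastChat directly from the main branch on GitHub
pip install "fschat[model_worker] @ git+https://github.com/lm-sys/FastChat.git"

# Check that the template now resolves without a KeyError
python -c "from fastchat.conversation import get_conv_template; print(get_conv_template('llama-3').name)"
```

Alternatively, clone the repo and run `pip install -e .` from its root if you want an editable install that tracks main.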