DachengLi1 / LongChat

Official repository for LongChat and LongEval

LongChat inference configuration #23

Open SeekWrldTea opened 1 year ago

SeekWrldTea commented 1 year ago

What inference configuration do you recommend: temperature, repetition penalty, etc.?

DachengLi1 commented 1 year ago

@SeekWrldTea temperature=1.0, repetition penalty=1.0: https://github.com/lm-sys/FastChat/blob/main/fastchat/serve/inference.py#L57
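
For reference, here is a minimal sketch of applying those two settings with Hugging Face `transformers` generation, rather than through FastChat's own serving path in the linked `inference.py`. The checkpoint name and prompt below are illustrative, not prescribed by this thread:

```python
# Minimal sketch: sampling with the recommended settings
# (temperature=1.0, repetition_penalty=1.0) via transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "lmsys/longchat-7b-16k"  # illustrative LongChat checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

inputs = tokenizer("Summarize the following conversation:", return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    do_sample=True,          # temperature only takes effect when sampling
    temperature=1.0,         # recommended above
    repetition_penalty=1.0,  # 1.0 means no penalty, as recommended
    max_new_tokens=256,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that `repetition_penalty=1.0` is a no-op, so these recommendations amount to plain temperature-1.0 sampling from the model's unmodified distribution.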