SeekWrldTea opened 1 year ago
What do you recommend for the inference configuration: temperature, repeat penalty, etc.?
@SeekWrldTea temp=1.0, penalty=1.0: https://github.com/lm-sys/FastChat/blob/main/fastchat/serve/inference.py#L57
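For context, both recommended defaults are effectively no-ops: temperature=1.0 leaves the logits unscaled, and repetition_penalty=1.0 applies no penalty. A minimal sketch of what these two knobs do to logits during sampling (illustrative only, following the common HF-transformers-style penalty; not FastChat's actual code):

```python
import math

def apply_penalties(logits, generated_ids, temperature=1.0, repetition_penalty=1.0):
    """Scale logits by temperature and penalize already-generated tokens.

    HF-style repetition penalty: divide positive logits (multiply negative
    ones) by the penalty for tokens that have already appeared.
    """
    out = list(logits)
    for tok in set(generated_ids):
        if out[tok] > 0:
            out[tok] /= repetition_penalty
        else:
            out[tok] *= repetition_penalty
    # temperature=1.0 leaves logits unchanged; <1 sharpens, >1 flattens
    return [x / temperature for x in out]

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

logits = [2.0, 1.0, -1.0]
# the recommended defaults (temp=1.0, penalty=1.0) change nothing
print(apply_penalties(logits, [0]) == logits)  # True
# penalty > 1 pushes down the logit of the repeated token 0
print(apply_penalties(logits, [0], repetition_penalty=1.3)[0] < logits[0])  # True
```

So with temp=1.0 / penalty=1.0 you are sampling straight from the model's raw distribution; raise repetition_penalty slightly (e.g. 1.1–1.2) only if you observe looping output.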