clp-research / clembench

A Framework for the Systematic Evaluation of Chat-Optimized Language Models as Conversational Agents and an Extensible Benchmark
MIT License

issue #47: add option to set max_token to be generated #48

Closed. phisad closed this 4 months ago

phisad commented 4 months ago

This should do the trick

sherzod-hakimov commented 4 months ago

Wasn't the argument "-m" used for the model, and now it's for max_new_tokens?

Maybe we can use another letter or combination, e.g. "-mt" (max tokens).
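For illustration, a minimal argparse sketch of that suggestion; the flag names, defaults, and help texts here are assumptions for the example, not the actual clembench CLI:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Hypothetical CLI sketch: keep "-m" for the model, add "-mt" for max tokens."""
    parser = argparse.ArgumentParser(description="Illustrative benchmark runner CLI.")
    # "-m" stays reserved for the model name, as before.
    parser.add_argument("-m", "--model", type=str, required=True,
                        help="Name of the chat-optimized model to evaluate.")
    # Separate, non-clashing flag for the generation length limit.
    parser.add_argument("-mt", "--max_tokens", type=int, default=100,
                        help="Maximum number of new tokens to generate per turn.")
    return parser


if __name__ == "__main__":
    args = build_parser().parse_args()
    print(f"model={args.model}, max_tokens={args.max_tokens}")
```

Because "-mt" is registered as its own option string, argparse matches it exactly and it does not collide with "-m", so e.g. `python run.py -m gpt-3.5-turbo -mt 150` parses both values as intended.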