togethercomputer / OpenChatKit

Apache License 2.0

Setting --max-tokens produces an error no matter what the value #91

Open Adrian-1234 opened 1 year ago

Adrian-1234 commented 1 year ago

```shell
$ python inference/bot.py --model togethercomputer/Pythia-Chat-Base-7B --max-tokens 128
```

```
Loading togethercomputer/Pythia-Chat-Base-7B to cuda:0...
Loading checkpoint shards: 100%|████████████████████████████████████████| 2/2 [00:06<00:00,  3.26s/it]
Welcome to OpenChatKit shell. Type /help or /? to list commands.
```

hi

```
Traceback (most recent call last):
  File "Togethercomputer-GPT-NeoXT-Chat-Base-20B/OpenChatKit-main/inference/bot.py", line 269, in <module>
    main()
  File "Togethercomputer-GPT-NeoXT-Chat-Base-20B/OpenChatKit-main/inference/bot.py", line 255, in main
    OpenChatKitShell(
  File "anaconda3/lib/python3.9/cmd.py", line 138, in cmdloop
    stop = self.onecmd(line)
  File "anaconda3/lib/python3.9/cmd.py", line 217, in onecmd
    return func(arg)
  File "AI_Projects/Togethercomputer-GPT-NeoXT-Chat-Base-20B/OpenChatKit-main/inference/bot.py", line 134, in do_say
    output = self._model.do_inference(
  File "AI_Projects/Togethercomputer-GPT-NeoXT-Chat-Base-20B/OpenChatKit-main/inference/bot.py", line 75, in do_inference
    outputs = self._model.generate(
  File "anaconda3/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "anaconda3/lib/python3.9/site-packages/transformers/generation/utils.py", line 1295, in generate
    generation_config.max_length = generation_config.max_new_tokens + input_ids_seq_length
TypeError: can only concatenate str (not "int") to str
```
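For anyone hitting this: the root cause is that `argparse` returns every command-line value as a string unless a `type` is declared, so `--max-tokens 128` arrives as the string `"128"` and `transformers` then tries to add it to an integer sequence length. A minimal reproduction (flag name taken from the command above; the rest is a standalone sketch, not the actual `bot.py` code):

```python
import argparse

# No type= given, so argparse leaves the value as a string -- the
# same situation as the --max-tokens flag in bot.py.
parser = argparse.ArgumentParser()
parser.add_argument("--max-tokens", default=128)
args = parser.parse_args(["--max-tokens", "128"])

input_ids_seq_length = 10  # stand-in for the prompt length
try:
    # Mirrors generation_config.max_new_tokens + input_ids_seq_length
    max_length = args.max_tokens + input_ids_seq_length
except TypeError as e:
    print(e)  # can only concatenate str (not "int") to str
```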

orangetin commented 1 year ago

Thanks for the report! I was able to reproduce the error. You can fix it by adding `type=int` to the `parser.add_argument` call for `max_tokens`. I'll have a PR up soon to fix this (and to specify types for the other arguments as well).
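The suggested change, sketched as a standalone snippet (the default value here is an assumption, not necessarily what `bot.py` uses):

```python
import argparse

parser = argparse.ArgumentParser()
# type=int makes argparse convert the flag value before it reaches
# model.generate(), so the str + int TypeError can no longer occur.
parser.add_argument("--max-tokens", type=int, default=256)

args = parser.parse_args(["--max-tokens", "128"])
assert isinstance(args.max_tokens, int)  # safe to add to an int now
```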