SeungyounShin / Llama2-Code-Interpreter

Make Llama2 use Code Execution, Debug, Save Code, Reuse it, Access to Internet

cannot access local variable 'sentencepiece_model_pb2' where it is not associated with a value #15

Closed: chen198328 closed this issue 10 months ago

chen198328 commented 10 months ago

```
(CodeLLama)@ubuntu:/Projects/Llama2-Code-Interpreter$ python3 chatbot.py --path Seungoun/codellama-7b-instruct-pad/
[2023-08-31 14:13:17,845] [INFO] [real_accelerator.py:158:get_accelerator] Setting ds_accelerator to cuda (auto detect)
/Projects/Llama2-Code-Interpreter/chatbot.py:104: GradioUnusedKwargWarning: You have unused kwarg parameters in Chatbot, please remove them: {'avatar_images': './assets/logo2.png'}
  chatbot = gr.Chatbot(height=820, avatar_images="./assets/logo2.png")
Traceback (most recent call last):
  File "/Projects/Llama2-Code-Interpreter/chatbot.py", line 238, in <module>
    gradio_launch(model_path=args.path, load_in_4bit=True)
  File "/Projects/Llama2-Code-Interpreter/chatbot.py", line 108, in gradio_launch
    interpreter = StreamingLlamaCodeInterpreter(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Projects/Llama2-Code-Interpreter/code_interpreter/LlamaCodeInterpreter.py", line 56, in __init__
    self.tokenizer = LlamaTokenizer.from_pretrained(
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/anaconda3/envs/CodeLLama/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 1854, in from_pretrained
    return cls._from_pretrained(
           ^^^^^^^^^^^^^^^^^^^^^
  File "/anaconda3/envs/CodeLLama/lib/python3.11/site-packages/transformers/tokenization_utils_base.py", line 2017, in _from_pretrained
    tokenizer = cls(*init_inputs, **init_kwargs)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/anaconda3/envs/CodeLLama/lib/python3.11/site-packages/transformers/models/llama/tokenization_llama.py", line 156, in __init__
    self.sp_model = self.get_spm_processor()
                    ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/anaconda3/envs/CodeLLama/lib/python3.11/site-packages/transformers/models/llama/tokenization_llama.py", line 167, in get_spm_processor
    model_pb2 = import_protobuf()
                ^^^^^^^^^^^^^^^^^
  File "/envs/CodeLLama/lib/python3.11/site-packages/transformers/convert_slow_tokenizer.py", line 40, in import_protobuf
    return sentencepiece_model_pb2
           ^^^^^^^^^^^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'sentencepiece_model_pb2' where it is not associated with a value
```
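For context on why this surfaces as an `UnboundLocalError` rather than a plain import error, the pattern is roughly the one sketched below. This is an illustrative sketch, not the actual `transformers` source; `have_protobuf` is a stand-in for the library's internal protobuf availability check.

```python
# Minimal sketch of the failure pattern (illustrative, not the real
# transformers code). The generated protobuf module is only bound inside a
# conditional branch, so when protobuf is missing the local name is never
# assigned and the final return raises UnboundLocalError.
def import_protobuf_sketch(have_protobuf: bool):
    if have_protobuf:
        # In the real library this imports the generated
        # sentencepiece_model_pb2 module, which requires the protobuf package.
        from transformers.utils import sentencepiece_model_pb2
    return sentencepiece_model_pb2  # never assigned when have_protobuf is False


import_protobuf_sketch(False)  # raises the same UnboundLocalError as above
```

Installing the protobuf package lets the guarded import succeed, which is why the fix further down resolves it.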

SeungyounShin commented 10 months ago

What is your transformers version?

I currently have transformers pinned to a specific commit because of CodeLlama support.
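For anyone replying with their version, a quick way to check it (nothing project-specific assumed here):

```python
# Print the installed transformers version to include when reporting the issue.
import transformers

print(transformers.__version__)
```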

duronxx commented 10 months ago

Me too (screenshot attached).

duronxx commented 10 months ago

`pip install protobuf` solved the problem.
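After installing protobuf, a sanity check along these lines confirms the tokenizer loads; the checkpoint path is a placeholder for your local CodeLlama weights, not a path taken from this repo:

```python
# Verify protobuf is importable and the slow LlamaTokenizer now constructs
# without the UnboundLocalError. Replace the path with your local checkpoint.
import google.protobuf
from transformers import LlamaTokenizer

print("protobuf version:", google.protobuf.__version__)

tokenizer = LlamaTokenizer.from_pretrained("path/to/codellama-7b-instruct")
print(tokenizer.tokenize("print('hello world')"))
```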

chen198328 commented 10 months ago

> `pip install protobuf` solved the problem.

It works. Thank you so much!