SeungyounShin / Llama2-Code-Interpreter

Make Llama2 use Code Execution, Debug, Save Code, Reuse it, and Access the Internet

RUN: python chatbot.py --path Seungyoun/codellama-7b-instruct-pad ERROR: PermissionError: [Errno 13] Permission denied #31

Open Winhalls opened 2 months ago

Winhalls commented 2 months ago

Running python chatbot.py --path Seungyoun/codellama-7b-instruct-pad fails with a "permission denied" error on the path. The directory actually has the right permissions and is not locked by another process. How can I fix this? Thanks!

Details:

python chatbot.py --path Seungyoun/codellama-7b-instruct-pad

Traceback (most recent call last):
  File "D:\Winhalls_Lau\SproutsData\Project\AI\Llama2-Code-Interpreter\chatbot.py", line 238, in <module>
    gradio_launch(model_path=args.path, load_in_4bit=True)
  File "D:\Winhalls_Lau\SproutsData\Project\AI\Llama2-Code-Interpreter\chatbot.py", line 104, in gradio_launch
    chatbot = gr.Chatbot(height=820, avatar_images="./assets/logo2.png")
  File "D:\Python\Lib\site-packages\gradio\component_meta.py", line 159, in wrapper
    return fn(self, **kwargs)
  File "D:\Python\Lib\site-packages\gradio\components\chatbot.py", line 133, in __init__
    self.serve_static_file(avatar_images[0]),
  File "D:\Python\Lib\site-packages\gradio\blocks.py", line 354, in serve_static_file
    return client_utils.synchronize_async(
  File "D:\Python\Lib\site-packages\gradio_client\utils.py", line 858, in synchronize_async
    return fsspec.asyn.sync(fsspec.asyn.get_loop(), func, *args, **kwargs)  # type: ignore
  File "D:\Python\Lib\site-packages\fsspec\asyn.py", line 103, in sync
    raise return_result
  File "D:\Python\Lib\site-packages\fsspec\asyn.py", line 56, in _runner
    result[0] = await coro
  File "D:\Python\Lib\site-packages\gradio\processing_utils.py", line 479, in async_move_files_to_cache
    return await client_utils.async_traverse(
  File "D:\Python\Lib\site-packages\gradio_client\utils.py", line 1002, in async_traverse
    return await func(json_obj)
  File "D:\Python\Lib\site-packages\gradio\processing_utils.py", line 453, in _move_to_cache
    temp_file_path = await block.async_move_resource_to_block_cache(
  File "D:\Python\Lib\site-packages\gradio\blocks.py", line 275, in async_move_resource_to_block_cache
    temp_file_path = processing_utils.save_file_to_cache(
  File "D:\Python\Lib\site-packages\gradio\processing_utils.py", line 257, in save_file_to_cache
    temp_dir = hash_file(file_path)
  File "D:\Python\Lib\site-packages\gradio\processing_utils.py", line 189, in hash_file
    with open(file_path, "rb") as f:
PermissionError: [Errno 13] Permission denied: 'D:\Winhalls_Lau\SproutsData\Project\AI\Llama2-Code-Interpreter'
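Judging from the traceback, directory permissions may not be the real problem: gr.Chatbot is called with avatar_images="./assets/logo2.png" (a single string), gradio indexes avatar_images[0], which yields ".", and then tries to hash the repository directory as if it were a file, which raises PermissionError. A minimal sketch of a possible workaround, assuming a gradio 4.x install where avatar_images expects a (user, bot) tuple; this is not the repository's confirmed fix, and the asset path is taken from the traceback:

    # Hypothetical change to the gr.Chatbot call in chatbot.py (gradio_launch):
    # pass avatar_images as a (user, bot) tuple so gradio serves the image file
    # instead of attempting to open the project directory.
    import gradio as gr

    chatbot = gr.Chatbot(
        height=820,
        # None leaves the user avatar unset; the bot avatar uses the repo asset.
        avatar_images=(None, "./assets/logo2.png"),
    )

If this is indeed the cause, the same error would appear regardless of the directory's permissions, since open() is being called on a folder rather than a file.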