Significant-Gravitas / AutoGPT

AutoGPT is the vision of accessible AI for everyone, to use and to build on. Our mission is to provide the tools, so that you can focus on what matters.
https://agpt.co
MIT License

ValueError: not enough values to unpack (expected 2, got 1) #1818

Closed · xai26285 closed this issue 1 year ago

xai26285 commented 1 year ago

Duplicates

Steps to reproduce 🕹

```
Traceback (most recent call last):
  File "E:\autogpt\my_folder\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "E:\autogpt\my_folder\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "E:\autogpt\Auto-GPT\autogpt\__main__.py", line 572, in <module>
    main()
  File "E:\autogpt\Auto-GPT\autogpt\__main__.py", line 396, in main
    agent.start_interaction_loop()
  File "E:\autogpt\Auto-GPT\autogpt\__main__.py", line 448, in start_interaction_loop
    assistant_reply = chat.chat_with_ai(
  File "E:\autogpt\Auto-GPT\autogpt\chat.py", line 95, in chat_with_ai
    ) = generate_context(prompt, relevant_memory, full_message_history, model)
  File "E:\autogpt\Auto-GPT\autogpt\chat.py", line 43, in generate_context
    current_tokens_used = token_counter.count_message_tokens(current_context, model)
  File "E:\autogpt\Auto-GPT\autogpt\token_counter.py", line 24, in count_message_tokens
    encoding = tiktoken.encoding_for_model(model)
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken\model.py", line 75, in encoding_for_model
    return get_encoding(encoding_name)
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken\registry.py", line 63, in get_encoding
    enc = Encoding(**constructor())
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken_ext\openai_public.py", line 64, in cl100k_base
    mergeable_ranks = load_tiktoken_bpe(
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken\load.py", line 115, in load_tiktoken_bpe
    return {
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken\load.py", line 115, in <dictcomp>
    return {
ValueError: not enough values to unpack (expected 2, got 1)
```

Current behavior 😯

pip install --upgrade tiktoken

Expected behavior 🤔

What should I do to get it to run?

Your prompt 📝


# Paste your prompt here
```
Traceback (most recent call last):
  File "E:\autogpt\my_folder\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "E:\autogpt\my_folder\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "E:\autogpt\Auto-GPT\autogpt\__main__.py", line 572, in <module>
    main()
  File "E:\autogpt\Auto-GPT\autogpt\__main__.py", line 396, in main
    agent.start_interaction_loop()
  File "E:\autogpt\Auto-GPT\autogpt\__main__.py", line 448, in start_interaction_loop
    assistant_reply = chat.chat_with_ai(
  File "E:\autogpt\Auto-GPT\autogpt\chat.py", line 95, in chat_with_ai
    ) = generate_context(prompt, relevant_memory, full_message_history, model)
  File "E:\autogpt\Auto-GPT\autogpt\chat.py", line 43, in generate_context
    current_tokens_used = token_counter.count_message_tokens(current_context, model)
  File "E:\autogpt\Auto-GPT\autogpt\token_counter.py", line 24, in count_message_tokens
    encoding = tiktoken.encoding_for_model(model)
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken\model.py", line 75, in encoding_for_model
    return get_encoding(encoding_name)
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken\registry.py", line 63, in get_encoding
    enc = Encoding(**constructor())
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken_ext\openai_public.py", line 64, in cl100k_base
    mergeable_ranks = load_tiktoken_bpe(
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken\load.py", line 115, in load_tiktoken_bpe
    return {
  File "E:\autogpt\my_folder\lib\site-packages\tiktoken\load.py", line 115, in <dictcomp>
    return {
ValueError: not enough values to unpack (expected 2, got 1)
```
XUHuAIRuOGU0231 commented 1 year ago

I also encountered the same problem, but my friend was able to use it normally, and I don't know where I went wrong

hemangjoshi37a commented 1 year ago

It seems like there is a ValueError: not enough values to unpack (expected 2, got 1) error in the Auto-GPT repository. The error occurs due to a missing value in the function call. To solve this issue, you can try upgrading tiktoken by running the command "pip install --upgrade tiktoken". If the issue still persists, you can try reinstalling tiktoken using "pip uninstall tiktoken" and then "pip install tiktoken". Let me know if this solves your issue.
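
For what it's worth, the failure can usually be reproduced outside Auto-GPT with just tiktoken itself, which helps confirm whether the problem is in the tiktoken cache rather than in Auto-GPT (a minimal sketch; "gpt-3.5-turbo" is only an example model name):

```python
import tiktoken

# If the cached encoding file is truncated, this call raises
# "ValueError: not enough values to unpack (expected 2, got 1)".
encoding = tiktoken.encoding_for_model("gpt-3.5-turbo")
print(encoding.encode("hello world"))  # prints a short list of token ids when the cache is healthy
```

If this snippet fails with the same ValueError, the Auto-GPT code path is not the culprit.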

xai26285 commented 1 year ago

> It seems like there is a ValueError: not enough values to unpack (expected 2, got 1) error in the Auto-GPT repository. The error occurs due to a missing value in the function call. To solve this issue, you can try upgrading tiktoken by running the command "pip install --upgrade tiktoken". If the issue still persists, you can try reinstalling tiktoken using "pip uninstall tiktoken" and then "pip install tiktoken". Let me know if this solves your issue.

My tiktoken is already up to date, and I have already tried your method, but it still shows the same error.

XUHuAIRuOGU0231 commented 1 year ago

> It seems like there is a ValueError: not enough values to unpack (expected 2, got 1) error in the Auto-GPT repository. The error occurs due to a missing value in the function call. To solve this issue, you can try upgrading tiktoken by running the command "pip install --upgrade tiktoken". If the issue still persists, you can try reinstalling tiktoken using "pip uninstall tiktoken" and then "pip install tiktoken". Let me know if this solves your issue.

I have tried your method, but I still get the same error.

AhrenFullStop commented 1 year ago

Additionally, this solution won't work if you are running this in a Docker container.

RKP64 commented 1 year ago

```
C:\Users\91952\Auto-GPT-0.2.1> python -m autogpt
Welcome back! Would you like me to return to being Entrepreneur-GPT?
Continue with the last settings?
Name: Entrepreneur-GPT
Role: an AI designed to autonomously develop and run businesses with the
Goals: ['Increase net worth', 'Grow Twitter Account', 'Develop and manage multiple businesses autonomously']
Continue (y/n): y
Using memory of type: PineconeMemory
Using Browser: chrome
Traceback (most recent call last):
  File "C:\Users\91952\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 196, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "C:\Users\91952\AppData\Local\Programs\Python\Python310\lib\runpy.py", line 86, in _run_code
    exec(code, run_globals)
  File "C:\Users\91952\Auto-GPT-0.2.1\autogpt\__main__.py", line 53, in <module>
    main()
  File "C:\Users\91952\Auto-GPT-0.2.1\autogpt\__main__.py", line 49, in main
    agent.start_interaction_loop()
  File "C:\Users\91952\Auto-GPT-0.2.1\autogpt\agent\agent.py", line 65, in start_interaction_loop
    assistant_reply = chat_with_ai(
  File "C:\Users\91952\Auto-GPT-0.2.1\autogpt\chat.py", line 95, in chat_with_ai
    ) = generate_context(prompt, relevant_memory, full_message_history, model)
  File "C:\Users\91952\Auto-GPT-0.2.1\autogpt\chat.py", line 43, in generate_context
    current_tokens_used = token_counter.count_message_tokens(current_context, model)
  File "C:\Users\91952\Auto-GPT-0.2.1\autogpt\token_counter.py", line 25, in count_message_tokens
    encoding = tiktoken.encoding_for_model(model)
  File "C:\Users\91952\AppData\Local\Programs\Python\Python310\lib\site-packages\tiktoken\model.py", line 75, in encoding_for_model
    return get_encoding(encoding_name)
  File "C:\Users\91952\AppData\Local\Programs\Python\Python310\lib\site-packages\tiktoken\registry.py", line 63, in get_encoding
    enc = Encoding(**constructor())
  File "C:\Users\91952\AppData\Local\Programs\Python\Python310\lib\site-packages\tiktoken_ext\openai_public.py", line 64, in cl100k_base
    mergeable_ranks = load_tiktoken_bpe(
  File "C:\Users\91952\AppData\Local\Programs\Python\Python310\lib\site-packages\tiktoken\load.py", line 115, in load_tiktoken_bpe
    return {
  File "C:\Users\91952\AppData\Local\Programs\Python\Python310\lib\site-packages\tiktoken\load.py", line 115, in <dictcomp>
    return {
ValueError: not enough values to unpack (expected 2, got 1)
```

RKP64 commented 1 year ago

I have had the same issue for many days now. Can anybody help resolve this?

XUHuAIRuOGU0231 commented 1 year ago

No one has solved this problem definitively, but I heard that it may be necessary to use a paid account for the OpenAI API. If you can upgrade to a paid account, give it a try. If that solves the problem for you, please reply and let me know that it works.


bitfirefly commented 1 year ago

> > It seems like there is a ValueError: not enough values to unpack (expected 2, got 1) error in the Auto-GPT repository. The error occurs due to a missing value in the function call. To solve this issue, you can try upgrading tiktoken by running the command "pip install --upgrade tiktoken". If the issue still persists, you can try reinstalling tiktoken using "pip uninstall tiktoken" and then "pip install tiktoken". Let me know if this solves your issue.
>
> I have tried your method, but I still get the same error.

Same here.

bitfirefly commented 1 year ago

My problem is solved; I had forgotten to install git.

XUHuAIRuOGU0231 commented 1 year ago

I couldn't fix the error when running directly on my host machine; I think there may be a problem with my local environment. So I switched to Ubuntu 22 in VS Code, and now AutoGPT runs normally. If you haven't tried this approach yet, give it a try; it may solve your problem.


chairwa commented 1 year ago

Same issue here.

aChenspring commented 1 year ago

```
Warning: The file 'auto-gpt.json' does not exist. Local memory would not be saved to a file.
Using memory of type: LocalCache
Traceback (most recent call last):
  File "D:\gpt\Auto-GPT-master\Auto-GPT-master\scripts\main.py", line 461, in <module>
    main()
  File "D:\gpt\Auto-GPT-master\Auto-GPT-master\scripts\main.py", line 365, in main
    assistant_reply = chat.chat_with_ai(
                      ^^^^^^^^^^^^^^^^^^
  File "D:\gpt\Auto-GPT-master\Auto-GPT-master\scripts\chat.py", line 77, in chat_with_ai
    next_message_to_add_index, current_tokens_used, insertion_index, current_context = generate_context(
                                                                                       ^^^^^^^^^^^^^^^^^
  File "D:\gpt\Auto-GPT-master\Auto-GPT-master\scripts\chat.py", line 40, in generate_context
    current_tokens_used = token_counter.count_message_tokens(current_context, model)
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\gpt\Auto-GPT-master\Auto-GPT-master\scripts\token_counter.py", line 17, in count_message_tokens
    encoding = tiktoken.encoding_for_model(model)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Program Files\Python311\Lib\site-packages\tiktoken\model.py", line 75, in encoding_for_model
    return get_encoding(encoding_name)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "D:\Program Files\Python311\Lib\site-packages\tiktoken\registry.py", line 63, in get_encoding
    enc = Encoding(**constructor())
          ^^^^^^^^^^^^^
  File "D:\Program Files\Python311\Lib\site-packages\tiktoken_ext\openai_public.py", line 64, in cl100k_base
    mergeable_ranks = load_tiktoken_bpe(
                      ^^^^^^^^^^^^^^^^^^
  File "D:\Program Files\Python311\Lib\site-packages\tiktoken\load.py", line 115, in load_tiktoken_bpe
    return {
           ^
  File "D:\Program Files\Python311\Lib\site-packages\tiktoken\load.py", line 117, in <dictcomp>
    for token, rank in (line.split() for line in contents.splitlines() if line)
    ^^^^^^^^^^^
ValueError: not enough values to unpack (expected 2, got 1)
```

aChenspring commented 1 year ago

```python
import base64

def load_tiktoken_bpe(tiktoken_bpe_file: str) -> dict[bytes, int]:
    """Read a tiktoken BPE encoding from a file and convert it to a dict.

    Args:
        tiktoken_bpe_file: path to the file containing the tiktoken BPE encoding.

    Returns:
        A dict representing the BPE encoding: keys are byte strings
        (base64-decoded tokens), values are integers (the token's rank).
    """
    # Read the file contents (read_file_cached is defined elsewhere in tiktoken/load.py).
    contents = read_file_cached(tiktoken_bpe_file)

    # Each non-empty line should be "<base64 token> <rank>".
    return {
        base64.b64decode(token): int(rank)
        for token, rank in (line.split() for line in contents.splitlines() if line)
    }
```
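
To see why a truncated download produces exactly this error: the dict comprehension above unpacks each line into (token, rank), and a line that was cut off mid-write yields only one field. A tiny illustration (the token and rank values below are made up, not taken from a real encoding file):

```python
import base64

good = b"aGVsbG8= 31373"   # complete line: "<base64 token> <rank>"
truncated = b"aGVsbG8="    # download stopped before the rank was written

token, rank = good.split()
print(base64.b64decode(token), int(rank))   # b'hello' 31373

try:
    token, rank = truncated.split()
except ValueError as e:
    print(e)   # not enough values to unpack (expected 2, got 1)
```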
aChenspring commented 1 year ago

Guys, this is caused by an incompletely downloaded file under C:\Users\itcast\AppData\Local\Temp\data-gym-cache.

XUHuAIRuOGU0231 commented 1 year ago

Could you share the specific fix, please?


aChenspring commented 1 year ago

Download a fresh copy of the file from https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken and use it to replace the incomplete file under C:\Users\itcast\AppData\Local\Temp\data-gym-cache.
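
One way to confirm which cached file is broken before replacing anything is to scan the cache directory for lines that do not split into the expected `<base64 token> <rank>` pair. This is only a rough sketch under a couple of assumptions: tiktoken's cache defaults to a `data-gym-cache` folder under the system temp directory (the `TIKTOKEN_CACHE_DIR` environment variable can point it elsewhere), the cached file names are hashes rather than `cl100k_base.tiktoken`, and some cached artifacts (for example GPT-2's `encoder.json`) legitimately use a different format, so treat a flag as a hint rather than proof:

```python
import os
import tempfile

# Assumed default cache location; TIKTOKEN_CACHE_DIR overrides it if set.
cache_dir = os.environ.get("TIKTOKEN_CACHE_DIR") or os.path.join(
    tempfile.gettempdir(), "data-gym-cache"
)

for name in sorted(os.listdir(cache_dir)):
    with open(os.path.join(cache_dir, name), "rb") as f:
        lines = f.read().splitlines()
    # A healthy rank file has exactly two fields per non-empty line.
    bad = [i for i, line in enumerate(lines, 1) if line and len(line.split()) != 2]
    if bad:
        print(f"{name}: looks truncated/corrupt (first malformed line: {bad[0]})")
    else:
        print(f"{name}: {len(lines)} lines, looks complete")
```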

XUHuAIRuOGU0231 commented 1 year ago

Thank you very much. I just deleted the folder you mentioned and re-downloaded the files with git, and the problem is now gone.


yangpeng0607 commented 1 year ago

Thank you very much. After several days the problem is finally solved: delete the ~\AppData\Local\Temp\data-gym-cache folder and run it again.

> Download a fresh copy of the file from https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken and use it to replace the incomplete file under C:\Users\itcast\AppData\Local\Temp\data-gym-cache.

ntindle commented 1 year ago

Outside repo problem?

Ayan-Bandyopadhyay commented 1 year ago

I'm still running into this issue even on the latest version of tiktoken (0.3.3)

mouzhi commented 1 year ago

Steps to fix this on macOS: 1. Run `env` in the terminal and find the TMPDIR directory. 2. Open it in Finder and locate the data-gym-cache directory. 3. Delete it and re-run autogpt; it then works normally again.
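
The same idea can be done from Python so it works on Windows, macOS, and Linux alike (a sketch; it assumes the cache lives in the default `data-gym-cache` folder under the system temp directory unless `TIKTOKEN_CACHE_DIR` points elsewhere):

```python
import os
import shutil
import tempfile

# Locate the data-gym-cache folder under the system temp dir and remove it,
# so that tiktoken re-downloads the encoding files on the next run.
cache_dir = os.environ.get("TIKTOKEN_CACHE_DIR") or os.path.join(
    tempfile.gettempdir(), "data-gym-cache"
)
if os.path.isdir(cache_dir):
    shutil.rmtree(cache_dir)
    print(f"Removed {cache_dir}; tiktoken will rebuild it on next use.")
else:
    print(f"No cache found at {cache_dir}.")
```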

Jadensha commented 8 months ago

> data-gym-cache

Deleting this folder did indeed fix it.

hemangjoshi37a commented 8 months ago

Hello everyone,

Thank you for your detailed descriptions of the issue you're facing with Auto-GPT. Based on the shared experiences, the ValueError: not enough values to unpack (expected 2, got 1) seems to be related to the BPE encoding files used by the tiktoken package. Here are some steps to potentially resolve this issue:

  1. Ensure All Dependencies are Installed: Make sure you have all necessary dependencies installed, including git. The absence of git or other dependencies might cause problems during the installation or updating of packages.

  2. Update tiktoken: Although many of you have already done this, ensure that tiktoken is updated to the latest version using:

    pip install --upgrade tiktoken
  3. Clear the data-gym-cache Directory: It seems that an incomplete download or corrupt file in the data-gym-cache directory could be causing the error. Follow these steps:

    • Locate and delete the data-gym-cache directory. The location may vary based on your system. For example, on Windows, it might be under C:\Users\[YourUsername]\AppData\Local\Temp\data-gym-cache.
    • After deleting the directory, try running Auto-GPT again. It should attempt to redownload the necessary files.
  4. Manual Download of BPE File: If the above steps don't work, you might need to manually download the correct BPE encoding file. Go to https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken and download the file. Replace the corresponding file in your data-gym-cache directory with this downloaded file.

  5. Running in a Different Environment: If you're still facing issues, consider running Auto-GPT in a different environment, such as a Docker container or a different operating system like Ubuntu, as some users have reported success with these methods.

Please try these steps and let us know if the issue persists. Your feedback is valuable in resolving this problem.
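
For step 4 in particular, here is a rough sketch of downloading the encoding file and sanity-checking it before using it as a replacement. The output filename and the validation logic are illustrative, not an official tiktoken workflow; since tiktoken stores its cache entries under hashed names, clearing the cache and letting it re-download (step 3) is usually the simpler route:

```python
import base64
import urllib.request

URL = "https://openaipublic.blob.core.windows.net/encodings/cl100k_base.tiktoken"

# Fetch the encoding file directly and verify that every non-empty line is a
# complete "<base64 token> <rank>" pair before trusting it.
data = urllib.request.urlopen(URL, timeout=60).read()

pairs = 0
for line in data.splitlines():
    if not line:
        continue
    token, rank = line.split()   # raises ValueError on a truncated line
    base64.b64decode(token)
    int(rank)
    pairs += 1

print(f"Downloaded {len(data)} bytes, {pairs} well-formed token/rank pairs.")
with open("cl100k_base.tiktoken", "wb") as f:   # illustrative local filename
    f.write(data)
```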