farizrahman4u / loopgpt

Modular Auto-GPT Framework

agent = Agent(model="gpt-4") throws an error #18

Closed. olavl closed this issue 1 year ago.

olavl commented 1 year ago

```
Traceback (most recent call last):
  File "C:\Users..\LoopGPT 1.py", line 17, in <module>
    agent.cli()
  File "C:..\AppData\Local\Programs\Python\Python310\lib\site-packages\loopgpt\agent.py", line 423, in cli
    cli(self, continuous=continuous)
  File "C:..\AppData\Local\Programs\Python\Python310\lib\site-packages\loopgpt\loops\repl.py", line 111, in cli
    resp = agent.chat()
  File "C:..\AppData\Local\Programs\Python\Python310\lib\site-packages\loopgpt\utils\spinner.py", line 137, in inner
    return func(*args, **kwargs)
  File "C:..\AppData\Local\Programs\Python\Python310\lib\site-packages\loopgpt\agent.py", line 173, in chat
    full_prompt, token_count = self.get_full_prompt(message)
  File "C:..\AppData\Local\Programs\Python\Python310\lib\site-packages\loopgpt\agent.py", line 90, in get_full_prompt
    token_count = count_tokens(prompt + userprompt, model=self.model)
  File "C:..\AppData\Local\Programs\Python\Python310\lib\site-packages\loopgpt\models\openai.py", line 48, in count_tokens
    enc = tiktoken.encoding_for_model(model)
  File "C:..\AppData\Local\Programs\Python\Python310\lib\site-packages\tiktoken\model.py", line 68, in encoding_for_model
    raise KeyError(
KeyError: 'Could not automatically map gpt-4 to a tokeniser. Please use tiktoken.get_encoding to explicitly get the tokeniser you expect.'
```
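The KeyError is raised by tiktoken rather than loopgpt: the installed tiktoken release is too old to map the model name "gpt-4" to an encoding, so `tiktoken.encoding_for_model` fails. Upgrading tiktoken (`pip install -U tiktoken`) is the simplest fix. Below is a minimal sketch of a defensive token counter that falls back to an explicit encoding; `count_tokens_safe` is a hypothetical helper for illustration, not loopgpt's actual `count_tokens` implementation.

```python
import tiktoken


def count_tokens_safe(text: str, model: str = "gpt-4") -> int:
    """Count tokens, falling back to an explicit encoding when the model
    name is unknown to the installed tiktoken release.

    Hypothetical helper for illustration; not part of loopgpt's API.
    """
    try:
        enc = tiktoken.encoding_for_model(model)
    except KeyError:
        # Older tiktoken versions cannot map "gpt-4"; cl100k_base is the
        # encoding used by the gpt-3.5/gpt-4 chat models, so request it
        # explicitly instead of relying on the model-name lookup.
        enc = tiktoken.get_encoding("cl100k_base")
    return len(enc.encode(text))


print(count_tokens_safe("Hello, world!"))
```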

FayazRahman commented 1 year ago

Hoping the issue was fixed for you through our communication outside of GitHub. Closing the issue; feel free to reopen it if the problem persists for some reason.