muellerberndt / mini-agi

MiniAGI is a simple general-purpose autonomous agent based on the OpenAI API.
MIT License

GPT4 is not available #69

Closed try-agaaain closed 1 year ago

try-agaaain commented 1 year ago

When I test with gpt-4 (it works fine with gpt-3.5-turbo in mini-agi), I get an error like this:

Working directory is /home/codespace/miniagi
Traceback (most recent call last):
  File "/workspaces/mini-agi/./miniagi.py", line 506, in <module>
    miniagi.think()
  File "/workspaces/mini-agi/./miniagi.py", line 302, in think
    response_text = self.agent.predict(
  File "/home/codespace/.local/lib/python3.10/site-packages/thinkgpt/llm.py", line 82, in predict
    return self.generate([[prompt]], remember=remember).generations[0][0].text
  File "/home/codespace/.local/lib/python3.10/site-packages/thinkgpt/llm.py", line 75, in generate
    result = self.execute_with_context_chain.predict(prompt=prompt, context='Nothing')
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chains/llm.py", line 151, in predict
    return self(kwargs)[self.output_key]
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chains/base.py", line 116, in __call__
    raise e
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chains/base.py", line 113, in __call__
    outputs = self._call(inputs)
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chains/llm.py", line 57, in _call
    return self.apply([inputs])[0]
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chains/llm.py", line 118, in apply
    response = self.generate(input_list)
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chains/llm.py", line 62, in generate
    return self.llm.generate_prompt(prompts, stop)
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chat_models/base.py", line 82, in generate_prompt
    raise e
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chat_models/base.py", line 79, in generate_prompt
    output = self.generate(prompt_messages, stop=stop)
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chat_models/base.py", line 54, in generate
    results = [self._generate(m, stop=stop) for m in messages]
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chat_models/base.py", line 54, in <listcomp>
    results = [self._generate(m, stop=stop) for m in messages]
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 266, in _generate
    response = self.completion_with_retry(messages=message_dicts, **params)
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 228, in completion_with_retry
    return _completion_with_retry(**kwargs)
  File "/home/codespace/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 289, in wrapped_f
    return self(f, *args, **kw)
  File "/home/codespace/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 379, in __call__
    do = self.iter(retry_state=retry_state)
  File "/home/codespace/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 314, in iter
    return fut.result()
  File "/usr/local/python/3.10.8/lib/python3.10/concurrent/futures/_base.py", line 451, in result
    return self.__get_result()
  File "/usr/local/python/3.10.8/lib/python3.10/concurrent/futures/_base.py", line 403, in __get_result
    raise self._exception
  File "/home/codespace/.local/lib/python3.10/site-packages/tenacity/__init__.py", line 382, in __call__
    result = fn(*args, **kwargs)
  File "/home/codespace/.local/lib/python3.10/site-packages/langchain/chat_models/openai.py", line 226, in _completion_with_retry
    return self.client.create(**kwargs)
  File "/home/codespace/.local/lib/python3.10/site-packages/openai/api_resources/chat_completion.py", line 25, in create
    return super().create(*args, **kwargs)
  File "/home/codespace/.local/lib/python3.10/site-packages/openai/api_resources/abstract/engine_api_resource.py", line 153, in create
    response, _, api_key = requestor.request(
  File "/home/codespace/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 230, in request
    resp, got_stream = self._interpret_response(result, stream)
  File "/home/codespace/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 624, in _interpret_response
    self._interpret_response_line(
  File "/home/codespace/.local/lib/python3.10/site-packages/openai/api_requestor.py", line 687, in _interpret_response_line
    raise self.handle_error_response(
openai.error.InvalidRequestError: The model: `gpt-4` does not exist

My GPT-4 API key is fine; it works in AutoGPT, so it's not clear to me what the problem is.
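For what it's worth, the `The model: `gpt-4` does not exist` error from the API usually means the key itself has no GPT-4 access (access was granted per account at the time), regardless of whether the key is otherwise valid. One way to check is to list the models the key can actually see. This is only a sketch against the legacy pre-1.0 `openai` Python SDK that the traceback above is using; the `has_model` helper is a hypothetical name for illustration:

```python
# Check whether a key's model listing includes gpt-4 (legacy openai<1.0 SDK,
# matching the traceback above). has_model is an illustrative helper.

def has_model(model_ids, wanted="gpt-4"):
    """Return True if the wanted model id appears in the key's model listing."""
    return wanted in model_ids

# To query the API with your own key, uncomment (requires the pre-1.0 SDK):
# import openai
# openai.api_key = "sk-..."
# ids = [m["id"] for m in openai.Model.list()["data"]]
# print("gpt-4 available:", has_model(ids))

# Local demonstration with a sample listing:
sample = ["gpt-3.5-turbo", "text-embedding-ada-002"]
print(has_model(sample))              # False: this key would lack GPT-4 access
print(has_model(sample + ["gpt-4"]))  # True
```

If `gpt-4` is missing from the listing, the fix is on the account side (GPT-4 access), not in mini-agi's configuration.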