
BadRequestError + Cannot find api in .env file (docker) #37

Open CrazyJazzHands opened 1 month ago

CrazyJazzHands commented 1 month ago

I'm using a MacBook M1 and tried conda, then Docker; both presented issues. After running 'docker compose up' I get the following terminal output, even though my Groq API key is in the .env file:

[+] Running 1/0
 ✔ Container agent-zero-server-1  Creat...                                 0.0s 
Attaching to server-1
server-1  | Initializing framework...
server-1  | Traceback (most recent call last):
server-1  |   File "/app/main.py", line 166, in <module>
server-1  |     initialize()
server-1  |   File "/app/main.py", line 20, in initialize
server-1  |     chat_llm = models.get_groq_llama70b_json(temperature=0.2)
server-1  |                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1  |   File "/app/models.py", line 70, in get_groq_llama70b_json
server-1  |     return ChatGroq(model_name="llama3-70b-8192", temperature=temperature, api_key=api_key, model_kwargs={"response_format": {"type": "json_object"}}) # type: ignore
server-1  |            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
server-1  |   File "/usr/local/lib/python3.12/site-packages/pydantic/v1/main.py", line 341, in __init__
server-1  |     raise validation_error
server-1  | pydantic.v1.error_wrappers.ValidationError: 1 validation error for ChatGroq
server-1  | __root__
server-1  |   Did not find groq_api_key, please add an environment variable `GROQ_API_KEY` which contains it, or pass `groq_api_key` as a named parameter. (type=value_error)
server-1 exited with code 1
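
For reference, the validation error can be reproduced in isolation. A minimal sketch, assuming the langchain_groq version the traceback points into (the one still on pydantic v1):

import os
from langchain_groq import ChatGroq

# Make sure neither source of the key is available.
os.environ.pop("GROQ_API_KEY", None)

# Raises the same pydantic ValidationError as above: ChatGroq finds no
# groq_api_key parameter and no GROQ_API_KEY environment variable.
ChatGroq(model_name="llama3-70b-8192", temperature=0.2, api_key=None)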

I did not have this issue before, when I was not using Docker. However, I did run into a different odd issue back then. I used conda + Zed to run agent-zero; the agent started, but when I enter a prompt I get the following:

{
    "system_error": "Traceback (most recent call last):
  File "/opt/anaconda3/lib/python3.12/site-packages/groq/_base_client.py", line 920, in request
    return self._request(
           ^^^^^^^^^^^^^^
  File "/opt/anaconda3/lib/python3.12/site-packages/groq/_base_client.py", line 1018, in _request
    raise self._make_status_error_from_response(err.response) from None
groq.BadRequestError: Error code: 400 - {'error': {'message': 'response_format` does not support streaming', 'type': 'invalid_request_error'}}
"
}

I'd appreciate any help. I'd also like to commend the author; this is a very interesting project and I can't wait to get it running and start experimenting.

frdel commented 1 month ago

The second error suggests you used the Llama 3 JSON-mode model from Groq; JSON mode is not compatible with streaming, so I would recommend the standard-mode Llama models instead. I have removed the JSON-mode model getters from models.py for future versions. The first error is related to the API key: are you using GROQ_API_KEY or API_KEY_GROQ in your .env? API_KEY_GROQ is loaded and passed automatically by the framework. GROQ_API_KEY might work as well, since it is probably read internally by the interface when no API key is passed explicitly, but that is not verified information.
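
To make the distinction concrete, a minimal sketch of both points (parameter names taken from the traceback above; the GROQ_API_KEY fallback is how langchain_groq generally behaves, not verified against this repo):

import os
from langchain_groq import ChatGroq

# API_KEY_GROQ: the framework's own variable, read from .env and passed explicitly.
api_key = os.getenv("API_KEY_GROQ")

# Standard mode: no response_format in model_kwargs, so streaming works.
chat_llm = ChatGroq(
    model_name="llama3-70b-8192",
    temperature=0.2,
    api_key=api_key,  # if None, ChatGroq looks up GROQ_API_KEY itself
)

# JSON mode (the getters removed in future versions): this is what triggers
# the 400, since Groq rejects response_format combined with streaming.
# json_llm = ChatGroq(
#     model_name="llama3-70b-8192",
#     model_kwargs={"response_format": {"type": "json_object"}},
# )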

CrazyJazzHands commented 1 month ago

Thanks for the reply. I've sorted out the second error; it was an oversight on my part. For the first error, I tried changing API_KEY_GROQ to GROQ_API_KEY in .env and still got the same error. I only get that error when I use Docker; when I use an IDE with conda (using API_KEY_GROQ), the model loads fine. I wanted to use Docker because the author recommends it, but I think I'll stick with conda until I can figure out how to get it running on Docker.
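
For anyone landing here with the same Docker-only failure, one way to narrow it down (a suggestion, assuming the 'server' service name from the logs above) is to check whether the variable reaches the container at all, e.g. from a Python shell started with 'docker compose run server python':

# If both print None, the .env file is not being passed into the container,
# e.g. because it sits outside the compose project directory or is not
# listed in the compose file's env_file/environment section.
import os
print("API_KEY_GROQ:", os.getenv("API_KEY_GROQ"))
print("GROQ_API_KEY:", os.getenv("GROQ_API_KEY"))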