Open CrazyJazzHands opened 1 month ago
The second error looks like you used a Llama 3 JSON-mode model from Groq; JSON mode is not compatible with streaming, so I would recommend the standard-mode Llama models. I have removed the JSON-mode model getters from models.py for future versions. The first error is related to the API key: are you using GROQ_API_KEY or API_KEY_GROQ in .env? API_KEY_GROQ is loaded and passed automatically by the framework. The first might work as well, as it is probably picked up internally by the Groq client when no API key is provided explicitly, but I have not verified that.
Thanks for the reply. I've sorted out the second error; it was an oversight on my part. For the first error, I tried changing API_KEY_GROQ to GROQ_API_KEY in .env and still got the same error. I only get that error when I use Docker; when I run it from an IDE with conda (using API_KEY_GROQ), the model loads fine. I wanted to use Docker because it is recommended by the author, but I think I'll stick with conda until I can figure out how to get it running on Docker.
I'm using a MacBook M1 and I tried conda first, then Docker; both presented issues. After running `docker compose up` I get the following output in the terminal, even though my Groq API key is in the .env file:
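One thing worth checking in the Docker case: `docker compose` reads `.env` for variable substitution inside the compose file, but those variables only reach the container if the compose file forwards them, e.g. via `env_file:` or an `environment:` entry. A sketch of what that could look like (the service name and build context here are hypothetical, not taken from the project's actual compose file):

```yaml
# docker-compose.yml (sketch; service name is assumed)
services:
  agent-zero:
    build: .
    env_file:
      - .env   # forwards API_KEY_GROQ / GROQ_API_KEY into the container
```

`docker compose config` prints the resolved configuration, which is a quick way to confirm whether the variable is actually being passed through.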
I did not have this issue before I used Docker. However, I did hit a different odd issue: I used conda + Zed to run agent-zero. The agent started, but when I enter a prompt I get the following:
I'd appreciate any help. I'd also like to commend the author, as this is a very interesting project. I can't wait to get it running and start experimenting.