PraisonAI combines AutoGen, CrewAI, and similar frameworks into a low-code solution for building and managing multi-agent LLM systems, with a focus on simplicity, customisation, and efficient human-agent collaboration.
I've added the following to the .env file:
OPENAI_MODEL_NAME="mixtral-8x7b-32768"
OPENAI_API_KEY="gsk_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
OPENAI_API_BASE="https://api.groq.com/openai/v1"
but I get the following error:
Traceback (most recent call last):
File "/home/andrea/repos/scripts/AI/praisonAI/venv/bin/praisonai", line 8, in <module>
sys.exit(main())
^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/praisonai/__main__.py", line 7, in main
praison_ai.main()
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/praisonai/cli.py", line 76, in main
self.agent_file = generator.generate()
^^^^^^^^^^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/praisonai/auto.py", line 45, in generate
response = self.client.chat.completions.create(
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/instructor/patch.py", line 570, in new_create_sync
response = retry_sync(
^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/instructor/patch.py", line 387, in retry_sync
for attempt in max_retries:
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/tenacity/__init__.py", line 347, in __iter__
do = self.iter(retry_state=retry_state)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/tenacity/__init__.py", line 325, in iter
raise retry_exc.reraise()
^^^^^^^^^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/tenacity/__init__.py", line 158, in reraise
raise self.last_attempt.result()
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/instructor/patch.py", line 390, in retry_sync
response = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/openai/_utils/_utils.py", line 275, in wrapper
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/openai/resources/chat/completions.py", line 667, in create
return self._post(
^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1233, in post
return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/openai/_base_client.py", line 922, in request
return self._request(
^^^^^^^^^^^^^^
File "/home/andrea/repos/scripts/AI/praisonAI/venv/lib/python3.11/site-packages/openai/_base_client.py", line 1013, in _request
raise self._make_status_error_from_response(err.response) from None
openai.AuthenticationError: Error code: 401 - {'error': {'message': 'Invalid API Key', 'type': 'invalid_request_error', 'code': 'invalid_api_key'}}
On the other hand, if I export the variables like so:
export OPENAI_MODEL_NAME="mixtral-8x7b-32768"
export OPENAI_API_KEY="gsk_xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
export OPENAI_API_BASE="https://api.groq.com/openai/v1"
Groq works perfectly fine.
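This behaviour suggests the .env file is never loaded into the process environment, so the OpenAI client falls back to a missing/invalid key and Groq returns the 401. As a workaround sketch (assuming that diagnosis is right), the variables can be loaded explicitly before PraisonAI runs — the `load_env_file` helper below is a hypothetical minimal loader shown only to illustrate the idea; python-dotenv's `load_dotenv()` does the same job properly.

```python
import os

def load_env_file(path=".env"):
    """Hypothetical minimal .env loader: copies KEY="value" lines into
    os.environ without overwriting variables that are already exported."""
    if not os.path.exists(path):
        return
    with open(path) as f:
        for line in f:
            line = line.strip()
            # skip blanks, comments, and lines without an assignment
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # setdefault keeps any value the shell already exported
            os.environ.setdefault(key.strip(), value.strip().strip('"'))
```

Equivalently, from the shell, `set -a; source .env; set +a` exports every assignment in the file before invoking `praisonai`, which matches the `export` case that already works.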