-
Support running AutoTx against a locally-running LLM. This could be done via an environment variable that, if present, disables the use of OpenAI and uses the local LLM instead.
Relevant Docs: http…
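A minimal sketch of how such a switch could look. The variable names `LOCAL_LLM_BASE_URL` and `LOCAL_LLM_MODEL` are placeholders, not AutoTx's actual configuration, and it assumes a local server exposing an OpenAI-compatible API:
```python
# Sketch only: LOCAL_LLM_BASE_URL / LOCAL_LLM_MODEL are hypothetical names,
# not AutoTx's real configuration. If the variable is present, requests go to
# the local OpenAI-compatible server instead of api.openai.com.
import os
from openai import OpenAI

local_base_url = os.getenv("LOCAL_LLM_BASE_URL")  # e.g. "http://localhost:11434/v1"

if local_base_url:
    # Local servers usually ignore the key, but the client still requires one.
    client = OpenAI(base_url=local_base_url, api_key="sk-local-placeholder")
else:
    client = OpenAI()  # default behaviour: OPENAI_API_KEY + api.openai.com

response = client.chat.completions.create(
    model=os.getenv("LOCAL_LLM_MODEL", "gpt-4-turbo"),
    messages=[{"role": "user", "content": "ping"}],
)
print(response.choices[0].message.content)
```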
-
### Name
libraiger
### Discord Username (if applicable)
ondrowest
### Additional Context
I bring a wealth of experience in the realm of AI and web technologies, accumulated over several years…
-
I am using CrewAI 0.28.8. I used the callback function according to the CrewAI documentation, but the callback function is not working with CrewAI.
def callback_function(output: TaskOutput):
…
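For reference, a minimal, self-contained sketch of how a task callback is usually wired up (assuming crewai ~0.28.x; the fields available on `TaskOutput` can differ between versions):
```python
# Minimal sketch of a task callback in CrewAI (~0.28.x assumed).
from crewai import Agent, Task, Crew
from crewai.tasks.task_output import TaskOutput

def callback_function(output: TaskOutput):
    # Runs once the task finishes; print whatever the task produced.
    print(f"Task completed:\n{output}")

researcher = Agent(
    role="Researcher",
    goal="Summarize a topic in two sentences",
    backstory="An analyst who writes short summaries.",
)

task = Task(
    description="Write a two-sentence summary of large language models.",
    expected_output="A two-sentence summary.",
    agent=researcher,
    callback=callback_function,  # the callback is attached to the Task itself
)

crew = Crew(agents=[researcher], tasks=[task])
crew.kickoff()
```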
-
### Describe the bug
After setting up agents and a workflow with local endpoints, I get this error message: ```openai.OpenAIError: The api_key client option must be set either by passing api_key to …
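One common workaround (a sketch, not the project's official fix): the OpenAI client demands an `api_key` even when you only talk to a local endpoint, so give it a placeholder and point the model at the local server explicitly. The URL and model name below are examples only:
```python
import os
from langchain_openai import ChatOpenAI
from crewai import Agent

os.environ["OPENAI_API_KEY"] = "sk-not-needed-locally"  # placeholder only

local_llm = ChatOpenAI(
    base_url="http://localhost:11434/v1",  # e.g. Ollama's OpenAI-compatible API
    api_key="sk-not-needed-locally",
    model="llama3",
)

agent = Agent(
    role="Assistant",
    goal="Answer questions using the local model",
    backstory="A helpful local assistant.",
    llm=local_llm,  # hand the local model to the agent explicitly
)
```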
-
"**_It seems we encountered an unexpected error while trying to use the tool. This was the error: 'OPENAI_API_KEY'_**"
I got this message while trying to use tools. Can you please let me know how to …
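A frequent cause (assumed here, since the full message is cut off) is simply that the `OPENAI_API_KEY` environment variable is not set in the process running the crew; tools that call the OpenAI API read it at runtime:
```python
# Sketch of the usual fix: make sure OPENAI_API_KEY is set in the environment
# of the process that runs the crew, before any openai/crewai calls happen.
import os

# Option 1: export it in the shell before launching the script:
#   export OPENAI_API_KEY="sk-..."
# Option 2: set it at the very top of the script:
os.environ["OPENAI_API_KEY"] = "sk-..."  # replace with your real key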
-
Hi, where do I define the LLM model? I can't find a place where we can define the LLM.
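A minimal sketch: in CrewAI the model is chosen per agent via the `llm` argument (any LangChain chat model should work; `ChatOpenAI` is just an example here):
```python
from crewai import Agent
from langchain_openai import ChatOpenAI

writer = Agent(
    role="Writer",
    goal="Draft release notes",
    backstory="A concise technical writer.",
    llm=ChatOpenAI(model="gpt-4-turbo"),  # swap in whichever model the agent should use
)
```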
-
Your CrewAI script is saved here: scripts/crewai-autocrew-20240201-095215-A-hypothetical-software-development-comp-gamma.py
File "/mnt/d/interpreter/autocrew/scripts/crewai-autocrew-20240201-095058…
-
I am new here and wanted to try the sample, but I'm getting this error:
ImportError: cannot import name 'Agent' from partially initialized module 'crewai' (most likely due to a circular import)
Any suggestion…
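Since the full traceback is cut off, this is only a guess, but the most common cause of this particular ImportError is a local file or folder named `crewai` in the working directory that shadows the installed package:
```python
# Sketch of the usual culprit: a local crewai.py (or crewai/ folder) shadows the
# installed package, so `import crewai` loads your own half-initialized module.
# Rename the local file (e.g. crewai.py -> my_crew.py), remove any stale
# __pycache__ entries, and the import resolves to the real library again:
from crewai import Agent

agent = Agent(
    role="Tester",
    goal="Verify the installation",
    backstory="A smoke-test agent.",
)
print(agent.role)
```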