Hi, the script is becoming a valuable standalone system now. Great!! I just found an issue: I run the Ollama server as a separate instance. It is accessible on the default port, but during crew creation it turns out AutoCrew looks for a local ollama binary instead. 👍
Welcome to AutoCrew!
Use the -? or -h command line options to display help information.
Settings can be modified within "config.ini". Scripts are saved in the "scripts" subdirectory.
If you experience any errors, please create an issue on Github and attach "autocrew.log":
https://github.com/yanniedog/autocrew/issues/new
AutoCrew version: 3.1.0
You are running the latest version of AutoCrew.
Please specify your overall goal: research newest trends in fashion
How many alternative crews do you wish to generate? [3]:
Do you want the crews to be ranked afterwards? (yes/no) [yes]: yes
Use existing settings (LLM endpoint: openai, Model: gpt-3.5-turbo)? (y/n) [yes]: n
1) ollama
2) openai
Select the LLM endpoint: 1
Your downloaded models:
1. codellama:13b-python-q6_K
2. dolphin-mixtral:latest
3. llama-pro:8b-instruct-q8_0
4. llama-pro:8b-text-q8_0
5. llama-pro:latest
6. mistral:latest
7. mixtral:latest
8. neural-chat:latest
9. nous-hermes2-mixtral:latest
10. openhermes:7b-mistral-v2.5-q8_0
11. openhermes:latest
12. phi:latest
13. tinyllama:latest
14. [Download a NEW model]
Enter the number of the model to download, or type a model name:
Your choice (type 'back' to go back): 11
You have selected the model: openhermes:latest
Use the same settings for CrewAI scripts? (y/n): y
Initializing local connection to Ollama using model openhermes:latest...
Traceback (most recent call last):
File "/home/piotr/CrewAI/Tutorials/autocrew/core.py", line 120, in start_ollama_service
subprocess.check_output(["pgrep", "-f", "ollama serve"])
File "/home/piotr/anaconda3/envs/AutoCrewTutorials/lib/python3.12/subprocess.py", line 466, in check_output
return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/piotr/anaconda3/envs/AutoCrewTutorials/lib/python3.12/subprocess.py", line 571, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['pgrep', '-f', 'ollama serve']' returned non-zero exit status 1.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/piotr/CrewAI/Tutorials/autocrew/welcome.py", line 378, in <module>
main()
File "/home/piotr/CrewAI/Tutorials/autocrew/welcome.py", line 362, in main
autocrew = AutoCrew(config_path) # Pass the path to the config.ini file to the AutoCrew constructor
^^^^^^^^^^^^^^^^^^^^^
File "/home/piotr/CrewAI/Tutorials/autocrew/core.py", line 87, in __init__
self.ollama = self.initialize_ollama() if self.llm_endpoint == 'ollama' else None
^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/piotr/CrewAI/Tutorials/autocrew/core.py", line 105, in initialize_ollama
self.start_ollama_service()
File "/home/piotr/CrewAI/Tutorials/autocrew/core.py", line 125, in start_ollama_service
subprocess.Popen(["ollama", "serve"], start_new_session=True)
File "/home/piotr/anaconda3/envs/AutoCrewTutorials/lib/python3.12/subprocess.py", line 1026, in __init__
self._execute_child(args, executable, preexec_fn, close_fds,
File "/home/piotr/anaconda3/envs/AutoCrewTutorials/lib/python3.12/subprocess.py", line 1950, in _execute_child
raise child_exception_type(errno_num, err_msg, err_filename)
FileNotFoundError: [Errno 2] No such file or directory: 'ollama'
My setup: Win11 Pro, WSL2, and Ubuntu 22.04. AutoCrew was a fresh start on a clean Ubuntu install.
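Looking at the traceback, `start_ollama_service` first runs `pgrep -f "ollama serve"` and, when that finds nothing, unconditionally spawns a local `ollama` binary, which fails with `FileNotFoundError` on a machine that talks to a remote or separately managed instance. A minimal sketch of a fix, assuming Ollama's default HTTP endpoint on port 11434 (the function names come from the traceback; the HTTP probe and the `RuntimeError` message are my suggestion, not AutoCrew's actual code):

```python
import shutil
import subprocess
import urllib.error
import urllib.request


def ollama_reachable(url: str, timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers HTTP at the given URL."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200  # Ollama's root endpoint replies 200 OK
    except (urllib.error.URLError, OSError):
        return False  # connection refused / timed out: no server there


def start_ollama_service(url: str = "http://localhost:11434") -> None:
    """Launch 'ollama serve' only when no server is already reachable."""
    if ollama_reachable(url):
        return  # a separately running instance is already up; nothing to do
    # Check the binary exists on PATH before Popen, so a missing local
    # install raises a clear error instead of FileNotFoundError.
    if shutil.which("ollama") is None:
        raise RuntimeError(
            f"Ollama is not reachable at {url} and the 'ollama' "
            "binary was not found on PATH."
        )
    subprocess.Popen(["ollama", "serve"], start_new_session=True)
```

Probing the port instead of `pgrep` also covers the WSL2 case above, where the server process may live outside the environment AutoCrew runs in.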