Doriandarko / maestro

A framework for Claude Opus to intelligently orchestrate subagents.

Bug fix: Download local language model #18

Open start-life opened 5 months ago

start-life commented 5 months ago

```
C:\Users\z5050>ollama.pull('llama3:70b')
'ollama.pull' is not recognized as an internal or external command,
operable program or batch file.
```

It should be a different command.

`ollama run llama3` or `ollama run llama3:70b` works from the terminal; please fix this.
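For context, `ollama.pull('llama3:70b')` is a call from the `ollama` Python package, not a shell command, which is why cmd rejects it. A minimal sketch of the two equivalent ways to pull the model, assuming the package has been installed with `pip install ollama`:

```python
# Sketch: pulling the model through the ollama Python package instead of the shell.
# Shell equivalent (run in a terminal, not inside Python):  ollama pull llama3:70b
import ollama

ollama.pull('llama3:70b')   # downloads the model via the local Ollama server
print(ollama.list())        # lists the models now installed locally
```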

EyeSeeThru commented 3 months ago

FYI, you can skip this step entirely. You can pull the models normally from the terminal with `ollama pull [model]`, or use any Ollama models you have previously installed. Just enter the names of the installed models in the maestro-ollama.py file under the corresponding roles, and it will detect them as long as you have already run `pip install ollama`.
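As a rough sketch of what that configuration looks like (the exact variable names in maestro-ollama.py may differ; `llama3` is just an example of a model tag you have already pulled):

```python
# Hypothetical role/model assignments near the top of maestro-ollama.py.
# Use whatever Ollama model tags are already installed locally.
ORCHESTRATOR_MODEL = "llama3:70b"
SUB_AGENT_MODEL = "llama3"
REFINER_MODEL = "llama3:70b"

# Optional quick check that a configured model responds before running maestro.
import ollama
reply = ollama.chat(model=SUB_AGENT_MODEL,
                    messages=[{"role": "user", "content": "ping"}])
print(reply["message"]["content"])
```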