[X] I have searched the existing issues, and there is no existing issue for my problem
Which Operating System are you using?
Linux
Which version of AutoGPT are you using?
Stable (branch)
What LLM Provider do you use?
Other (detail in issue)
Which area covers your issue best?
Other
What commit or version are you using?
2618d1d87cd04623c848df870800f328fe36bc83
Describe your issue.
The documentation states that Ollama can be used by running a model in Ollama, starting the server, starting the builder, and then selecting the last option in the model list in the blocks. However, the model list always stays the same no matter which model I run in Ollama, and the last model in the list is llama3.1:405b, which the terminal (correctly) says I don't have.
Does that mean AutoGPT does not detect the running Ollama instance, since the list never shows anything else regardless of which model I run? Or does the entry referring to the running Ollama model simply not get renamed? Or am I missing something else?
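For reference, a running Ollama instance can report which models are actually installed via its local HTTP API (`GET /api/tags` on port 11434 by default), so detection is possible in principle. A minimal sketch (the helper names are mine, not AutoGPT's):

```python
import json
from urllib.request import urlopen

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local port

def parse_model_names(tags_payload: dict) -> list[str]:
    """Extract model names from an Ollama /api/tags response,
    which looks like {"models": [{"name": "llama3.1:8b", ...}, ...]}."""
    return [m["name"] for m in tags_payload.get("models", [])]

def list_local_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Query a running Ollama instance for its installed models."""
    with urlopen(f"{base_url}/api/tags") as resp:  # requires Ollama running
        return parse_model_names(json.load(resp))

# Example response shape from /api/tags (fields trimmed):
sample = {"models": [{"name": "llama3.1:8b"}, {"name": "mistral:7b"}]}
print(parse_model_names(sample))  # ['llama3.1:8b', 'mistral:7b']
```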
I found a way to get it working by editing the llm.py block file. Does that mean I have to add every model I want to run via Ollama by hand, each time I pull a new one?
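For context, the hand edit described above would look roughly like this, assuming the block file keeps its model list in a string Enum (the names and structure here are illustrative only, not AutoGPT's actual definitions):

```python
from enum import Enum

class LlmModel(str, Enum):
    # Entry shipped with the block (per the issue, the last Ollama option):
    OLLAMA_LLAMA3_405B = "llama3.1:405b"
    # Manually added entry for a model actually pulled locally:
    OLLAMA_LLAMA3_8B = "llama3.1:8b"

print(LlmModel.OLLAMA_LLAMA3_8B.value)  # llama3.1:8b
```

Each newly pulled Ollama model would need another such entry, which is the maintenance burden the question is about.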
Upload Activity Log Content
No response
Upload Error Log Content
No response