Closed — LachsBagel closed this issue 1 month ago
Currently the agents Docker container runs the functionary model, while the host OS runs the llama3 model directly via ollama.
Investigate whether llama3.1 8B can replace both of the above models by running on the host OS via ollama, allowing functionary to be removed from the agents container, while ensuring the container's agents retain the same functionality they presently have.
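As a rough sketch, the tool-use check could be scripted against ollama's `/api/chat` endpoint, which accepts an OpenAI-style `tools` list. The tool schema and prompt below are hypothetical placeholders; the point is to send llama3.1 a request with a tool definition and see whether the response contains usable `tool_calls`:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/chat"  # default ollama endpoint

# Hypothetical tool schema (OpenAI-style), just to exercise tool calling.
WEATHER_TOOL = {
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}

def build_tool_request(prompt: str) -> dict:
    """Build an /api/chat payload asking llama3.1 8B to use a tool."""
    return {
        "model": "llama3.1:8b",
        "messages": [{"role": "user", "content": prompt}],
        "tools": [WEATHER_TOOL],
        "stream": False,
    }

def extract_tool_calls(response: dict) -> list:
    """Pull any tool calls out of an ollama chat response."""
    return response.get("message", {}).get("tool_calls", [])

if __name__ == "__main__":
    payload = build_tool_request("What's the weather in Berlin?")
    print(json.dumps(payload, indent=2))
    # With `ollama pull llama3.1:8b` done and the server running, POST
    # `payload` to OLLAMA_URL and inspect extract_tool_calls() on the
    # JSON response; an empty or malformed list suggests the 8B model
    # is not reliable enough for the agents' tool use.
```

A harness like this would need to run against each agent's actual tool schemas, not a toy one, to judge whether 8B is sufficient.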
It appears that llama3.1 8B would not be sufficient for tool use; the minimum would be 70B, so sticking with functionary for now is best.
Consolidating to a single model would allow users to download one set of model parameters instead of two, reducing download and installation time.