Closed jcoombes closed 3 months ago
Ollama Integration doesn't allow setting num_ctx in ollama
^ The default value of num_ctx in the Ollama API is 2048, and I would like to run llama3:70b with a num_ctx of 8192.
https://docs.crewai.com/how-to/LLM-Connections/
Closing this — I fixed it by using a Modelfile in which I specified the num_ctx I wanted. This is an ollama feature, not a crewai feature.
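For anyone hitting the same limit, a minimal sketch of that workaround, using the model name and context size from this issue (pick your own name for the derived model):

```
# Modelfile — derive a new model from the base with a larger context window
FROM llama3:70b
PARAMETER num_ctx 8192
```

Then build and use the derived model, e.g. `ollama create llama3-70b-8k -f Modelfile`, and point crewai at `llama3-70b-8k` instead of `llama3:70b`.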