ollama / ollama-python

Ollama Python library
https://ollama.com
MIT License

Can I set num_ctx=-1 to use max possible context window? #84

Closed · eliranwong closed this issue 3 months ago

eliranwong commented 3 months ago

Can I set num_ctx=-1 to use the maximum possible context window?

That way I wouldn't need to change its value for different models.

jmorganca commented 3 months ago

Hi there, you can set it to a large value (e.g. 16k or 1M), and Ollama will automatically use the largest context window the model was trained against. Hope this helps!
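For reference, a minimal sketch of passing num_ctx through the Python client's options parameter. The model name "llama3", the prompt, and the value 16384 are placeholders for illustration; per the comment above, Ollama is expected to cap the value at the model's trained context window.

```python
import ollama

# Sketch: num_ctx is set per request via the options dict.
# "llama3" and the prompt are placeholders; 16384 is an arbitrary large value.
response = ollama.chat(
    model="llama3",
    messages=[{"role": "user", "content": "Summarize this issue thread."}],
    options={"num_ctx": 16384},
)
print(response["message"]["content"])
```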

eliranwong commented 3 months ago

Your suggestion does not work with some models I tested. If I set num_ctx too high, it just hangs forever without responding.
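One possible workaround is to look up the model's trained context length and pass it explicitly rather than oversizing num_ctx. This is only a sketch: it assumes the show call exposes a model_info mapping with an "<architecture>.context_length" key, which may not hold for every Ollama version or model family, and falls back to a default otherwise.

```python
import ollama

def trained_context_length(model: str, default: int = 2048) -> int:
    """Best-effort lookup of a model's trained context length via ollama.show.

    Assumes model info is exposed under a "<architecture>.context_length" key;
    key names may differ between Ollama versions and model families.
    """
    info = ollama.show(model)
    # Older client versions return a dict; newer ones return a response object.
    model_info = info.get("model_info", {}) if isinstance(info, dict) else getattr(info, "modelinfo", {})
    for key, value in (model_info or {}).items():
        if key.endswith(".context_length"):
            return int(value)
    return default

model = "llama3"  # placeholder model name
response = ollama.chat(
    model=model,
    messages=[{"role": "user", "content": "Hello"}],
    options={"num_ctx": trained_context_length(model)},
)
print(response["message"]["content"])
```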