D-314 (closed 2 months ago)
In addition to the Python library, ollama itself and the corresponding model must be installed:
```shell
# Install the Python client library
pip install ollama
# Install the ollama server (the install script elevates with sudo on its own,
# so the pipeline itself does not need to be run under sudo)
curl -fsSL https://ollama.com/install.sh | sh
# Download the model, then start an interactive session with it
ollama pull mistral-nemo:12b-instruct-2407-q6_K
ollama run mistral-nemo:12b-instruct-2407-q6_K
```
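Once the server is running, the model can also be queried programmatically. A minimal sketch using only the standard library, assuming the default local ollama endpoint (`http://localhost:11434`, `/api/generate`); the `ollama` Python package wraps this same HTTP API:

```python
import json
import urllib.request

MODEL = "mistral-nemo:12b-instruct-2407-q6_K"

def build_payload(prompt: str) -> dict:
    # "stream": False asks the server for one complete JSON response
    # instead of a stream of partial chunks.
    return {"model": MODEL, "prompt": prompt, "stream": False}

def generate(prompt: str, host: str = "http://localhost:11434") -> str:
    """POST a prompt to the local ollama server and return the reply text."""
    req = urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Calling `generate("Hello")` will only succeed while `ollama run` (or `ollama serve`) has the model loaded.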
This model requires an NVIDIA GPU with at least 12 GB of video memory, for example an RTX 3060.
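As a rough sanity check on that figure (assuming Mistral NeMo has about 12.2B parameters and that q6_K averages roughly 6.56 bits per weight; both are approximations, and the KV cache and activations add further overhead on top of the weights):

```python
# Back-of-the-envelope VRAM estimate for the quantized weights alone.
params = 12.2e9        # approximate parameter count
bits_per_weight = 6.56 # approximate q6_K average
weight_gb = params * bits_per_weight / 8 / 1e9
print(f"weights alone: ~{weight_gb:.1f} GB")  # ~10 GB, before cache/overhead
```

The weights alone land near 10 GB, which is why a 12 GB card is about the practical floor for this quantization.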
OS: Windows 11. The same error occurs on Linux, but with a different error code: