Describe the bug
Detect unmet system requirements for running a model (for example, the 200 GB+ of RAM needed for Llama 3.1 405B) and inform the user. Currently the server only logs a warning:
level=WARN source=server.go:136 msg="model request too large for system" requested="209.7 GiB" available=12541681664 total="14.9 GiB" free="8.5 GiB" swap="3.1 GiB"
Expected behavior
Currently the GUI only shows the generic error message "There was an error with the local Ollama instance, so it has been reset". Please show the actual cause (e.g. not enough memory for the requested model) within the GUI instead.
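As a rough illustration of the requested behavior, a minimal Go sketch of the check: compare the model's memory requirement against what the system reports as available, and build a specific user-facing message instead of the generic reset error. All names here are hypothetical, not Ollama's actual API; the byte values are taken from the log line above.

```go
package main

import "fmt"

const gib = 1 << 30 // bytes per GiB

// memoryWarning is a hypothetical helper: it returns a user-facing
// message (and true) when the requested model does not fit in the
// available system memory, mirroring the server.go warning above.
func memoryWarning(requestedBytes, availableBytes uint64) (string, bool) {
	if requestedBytes <= availableBytes {
		return "", false
	}
	return fmt.Sprintf(
		"This model needs about %.1f GiB of memory, but only %.1f GiB is available. "+
			"Try a smaller model or a more aggressive quantization.",
		float64(requestedBytes)/gib,
		float64(availableBytes)/gib,
	), true
}

func main() {
	// Values from the log line: requested="209.7 GiB", available=12541681664 bytes.
	reqGiB := 209.7
	requested := uint64(reqGiB * gib)
	if msg, tooBig := memoryWarning(requested, 12541681664); tooBig {
		fmt.Println(msg) // this text could be surfaced in the GUI dialog
	}
}
```

The GUI could display this message in place of the current "instance has been reset" dialog whenever the server rejects a model for being too large.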
Screenshots
Debugging information