I don't have the bandwidth to turn this into a full PR (in particular, unknown-model support implies that better configuration handling is required), but I'm putting it up in case it's helpful (especially the breaking llama.cpp upgrade).
You can also test this on unraid directly by replacing the repository of an existing install with ghcr.io/raiju/open-chat-cuda:v2.0.0-alpha.6.