Is it possible to integrate this with the Ollama model server? I tend to expose LLMs through Ollama to various applications that can talk to it. But I couldn't see an easy way to get Ollama to use the AirLLM version of a model, so that I can run the larger models locally and access them through the Ollama server. Cheers!