Open haesleinhuepf opened 2 weeks ago
Hi @haesleinhuepf, sadly, it is not working on my laptop. It seems I don't have enough memory [Error: model requires more system memory (11.4 GiB) than is available (9.9 GiB)].
Ok, then it might be worth exploring how to run it on clara/paula in the compute center?
Hi @lea-33 ,
how about introducing another LLM endpoint: ollama? New vision models were published recently, namely llama3.2-vision in 11B and 90B variants. Does the 11B version work in your environment? If so, we could overcome rate limits by using local models.
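For reference, a minimal sketch of how such an ollama endpoint could be queried once the model is pulled locally (`ollama pull llama3.2-vision`). This only builds the request payload for ollama's default REST API (`/api/generate` on `localhost:11434`); actually sending it requires a running ollama server, so that part is left out. Function and parameter names here are illustrative, not from the project:

```python
import base64


def build_vision_request(prompt, image_bytes, model="llama3.2-vision"):
    """Build the JSON payload for ollama's /api/generate endpoint.

    ollama expects images as base64-encoded strings in an "images" list.
    POSTing this dict to http://localhost:11434/api/generate would return
    the model's answer, assuming the server is running and the model
    has been pulled.
    """
    return {
        "model": model,
        "prompt": prompt,
        "images": [base64.b64encode(image_bytes).decode("ascii")],
        "stream": False,  # request a single response instead of a token stream
    }


# hypothetical usage with placeholder image bytes
payload = build_vision_request("Describe this image.", b"\x89PNG placeholder")
print(payload["model"])
```

If the 11B model fits on clara/paula, the same payload should work unchanged against the remote host instead of `localhost`.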