Closed abceleung closed 4 months ago
@abceleung I think so! My suggestion would be to pick a different model here (https://github.com/nkasmanoff/pi-card/blob/main/config.py#L16), since this one takes up a lot of RAM. If you switch to something even smaller, or use a different quantization style that reduces the memory required even more, I think this would work.
For more models: https://ollama.com/library
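As a rough illustration of why a smaller model or lower-bit quantization matters on a 4 GB board, here's a back-of-the-envelope estimate of weight-only memory use (the parameter counts are illustrative assumptions, not measurements of any specific Ollama model):

```python
def model_ram_gb(n_params_billion: float, bits_per_weight: int) -> float:
    """Rough weight-only RAM estimate: parameters * bits / 8, in GB.

    Ignores KV cache, activations, and runtime overhead, so real usage
    will be somewhat higher.
    """
    return n_params_billion * 1e9 * bits_per_weight / 8 / 1e9

# A hypothetical 3B-parameter model at different quantization levels:
print(round(model_ram_gb(3, 16), 1))  # fp16: 6.0 GB, won't fit in 4 GB
print(round(model_ram_gb(3, 4), 1))   # 4-bit: 1.5 GB, plausible on a Pi 4B
```

This is only a sketch, but it shows why dropping from fp16 to a 4-bit quant (or picking a sub-1B model) is usually what makes a 4 GB Pi workable.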
I have an RPi 4B (4 GB) and am wondering whether I can use it to build this project. Thanks!