joonas closed this 2 months ago
Awesome idea!
I've reworked this a bit to include the environment variables by default and to use the tarball-based distribution instead of just downloading the binary as before.
LGTM. Thanks! Note that `ollama.service` won't start automatically when the sysext is loaded. If that's not an issue, we can go ahead. :)
Hmm, that seems like it should probably happen on boot; I'll take a look at changing that.
When a service needs to start as soon as the sysext is loaded, we usually add a drop-in to `multi-user.target`: https://github.com/flatcar/sysext-bakery/blob/82e4914481f88f09923fe6b25700c1905e650d21/create_wasmcloud_sysext.sh#L92-L94
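For reference, a minimal sketch of that pattern adapted to Ollama, following the linked wasmcloud script (the drop-in file name is an assumption; `Upholds=` is what makes systemd pull the service up once `multi-user.target` is reached):

```bash
# Inside the sysext build script: ship a drop-in so multi-user.target upholds
# ollama.service whenever the extension is merged (the file name is illustrative).
mkdir -p "${SYSEXTNAME}"/usr/lib/systemd/system/multi-user.target.d
{ echo "[Unit]"; echo "Upholds=ollama.service"; } > "${SYSEXTNAME}"/usr/lib/systemd/system/multi-user.target.d/10-ollama-service.conf
```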
@tormath1 thanks for the pointer. I thought I had originally included that, but it looks like I'd overlooked it.
Should be good to go now 🙂
Add Ollama sysext
This creates a sysext for running Ollama on Flatcar.
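Building the image locally would look roughly like this, assuming the bakery's usual script naming convention (the script name and version argument are illustrative, not taken verbatim from this PR):

```bash
# Build the Ollama sysext image; arguments are <version> <sysext name> (values are illustrative).
./create_ollama_sysext.sh 0.3.9 ollama
# The resulting ollama.raw can then be placed under /etc/extensions/ on a Flatcar host.
```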
How to use
Create a Flatcar instance with a configuration that looks roughly like the following.
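A minimal Butane sketch, assuming the image is published as an `ollama.raw` asset on the bakery's releases page (the release URL, file name, and target path are assumptions):

```yaml
# Butane config; transpile with `butane` to Ignition JSON before provisioning.
variant: flatcar
version: 1.0.0
storage:
  files:
    # Placing a .raw image under /etc/extensions makes systemd-sysext merge it at boot.
    - path: /etc/extensions/ollama.raw
      contents:
        # Illustrative URL; use the actual release asset published by the bakery.
        source: https://github.com/flatcar/sysext-bakery/releases/download/latest/ollama.raw
```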
Testing done
I've deployed a DigitalOcean droplet with the above configuration and verified that the expected version of Ollama was available and could be used to run inference (following the steps from the Ollama vision models blog post).
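A sketch of the kind of checks involved (the model name follows the Ollama vision models blog post; the image path and exact commands are illustrative assumptions):

```bash
systemd-sysext status            # the ollama extension should show up as merged
systemctl status ollama.service  # the service should be active after boot
ollama --version                 # confirm the expected Ollama version
# Quick vision-model inference test, per the Ollama blog post
ollama run llama3.2-vision "Describe this image: /home/core/test.jpg"
```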
- [ ] Changelog entries added in the respective changelog/ directory (user-facing change, bug fix, security fix, update)
- [ ] Inspected CI output for image differences: /boot and /usr size, packages, list files for any missing binaries, kernel modules, config files, etc.