Prerequisites

[X] I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
[X] I reviewed the Discussions, and have a new and useful enhancement to share.
Feature Description
Add a mention that -m can run other llamafiles.
Motivation
On Windows, the 4 GB executable size limit means the binary itself must stay under that limit, but a llamafile can be larger than that.
The README needs to be updated to mention that you can use a standalone llamafile engine without any weights, and use it to bootstrap and run a normal llamafile via llamafile -m foo.llamafile.
I searched README.md for a mention of this, but didn't find one.
Possible Implementation
No response