Mozilla-Ocho / llamafile

Distribute and run LLMs with a single file.
https://llamafile.ai

UX Request: Update readme to mention `llamafile -m foo.llamafile` as an option #511

Open mofosyne opened 3 months ago

mofosyne commented 3 months ago

Feature Description

Add a mention that `-m` can run other llamafiles.

Motivation

On Windows, executables are limited to 4 GiB, so the llamafile binary itself must stay under that limit. A llamafile with embedded weights can be larger than that.

The readme should be updated to mention that you can use a standalone llamafile engine (one without any embedded weights) to bootstrap a normal llamafile via `llamafile -m foo.llamafile`.

I searched readme.md for a mention of this but didn't find one.
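For illustration, the requested readme addition could show something like the following sketch. Here `foo.llamafile` is a placeholder for any model llamafile, and the engine filename is hypothetical; the only flag assumed is `-m`, which the issue itself describes.

```shell
# Workaround for the Windows 4 GiB executable limit:
# keep the bare llamafile engine (no embedded weights) small,
# and load a larger model llamafile as external weights.

# On Windows, rename the engine to give it an .exe suffix, e.g.:
#   copy llamafile llamafile.exe

# Run the engine, pointing it at the oversized llamafile:
./llamafile -m foo.llamafile
```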

Possible Implementation

No response

jart commented 3 months ago

Pull requests welcome!